
03-01-09 - Clean Up

It does actually help to clean up your machine from time to time. One issue is that NTFS dir listing on large dirs is really slow, so huge dirs full of temp files make dir listing chug. Another issue is that I believe Firefox has an N^2 kind of bug with temporary files where it eventually becomes outrageously slow.

Anyhoo, cleaning up is also a very good idea before you do a defrag. It's silly to defrag around a huge mess of temporary files when it would be better to just kill them and let them get recreated from scratch afterward.

This is the kind of thing you probably only need to do like once a year. BTW it's not a bad idea to do before backing up your machine, since it's kind of silly to back up a bunch of temp files. (This stuff was literally 20% of the bytes on my disk - 10 GB out of 50.) It's a bit of a catch-22, because of course you'd like to back up *before* you run around deleting lots of junk. More on backups later..

Here's my cleanup routine :

On every drive :


call dele *.chk
call dele ffast*
call dele -y recycled
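
(*.chk are chkdsk recovery fragments and ffast* are old Office FindFast index files - both pure junk in the drive root.) If you don't have dele/zdel, the stock cmd builtins are a rough equivalent. A sketch, assuming the XP-era layout where the recycle bin dir is RECYCLED on FAT and RECYCLER on NTFS :

rem run from the root of each drive ; Windows recreates the bin dir when needed
del /q *.chk
del /q ffast*
rd /s /q RECYCLED
rd /s /q RECYCLER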

And try to get the temp dirs :


call zdel -y -r  %TEMP%\*
call zdel -y -r  %TMP%\*
call zdel -y -r "%USERPROFILE%\local settings\temp\*"
call zdel -y -r "%USERPROFILE%\local settings\temporary internet files\*"

Then go to your source code and wipe out the crud there :


zdel -r "*.(ncb|sbr|pch|bak|pdb|ilk|idb|opt|plg|bsc|obj|suo|log_prev)"

My god the ncb's and the suo's and all that stuff are a huge waste of space. Note I have "obj" and "pdb" in there - you might not want to delete those. I do. If I can't rebuild the obj I figure I don't care about that code any more.
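
If you don't have zdel's regex alternation, a plain for loop run from the root of the source tree does roughly the same thing (the doubled %'s are for batch files; use single % at the prompt) :

for %%e in (ncb sbr pch bak pdb ilk idb opt plg bsc obj suo log_prev) do del /s /q "*.%%e"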

Oh, but there's more. Lots of programs keep their own huge temp dirs. (There was one really bad culprit of this, but now I forget what it was; damn.) One is "Visual Assist" :


zdel -r "C:\Program Files\Visual Assist.NET\vc7\cache\*"
zdel -r "C:\Program Files\Visual Assist.NET\vc7\history\*"

I assume you run a P4 server or something at home, so go ahead and checkpoint your journal too :


p4d -jc -z
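
(-jc takes a checkpoint and rotates the journal; -z gzips the rotated journal and the checkpoint. If you don't run it from the server root, point at it with -r - the path here is made up :)

p4d -r D:\p4root -jc -z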

We've missed some shite from Firefox, though, and I don't have a good non-GUI way to get at it, so use the GUI to delete all history and temporary files.
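
The disk cache at least you can get from a batch file - on XP it lives under the Firefox profile dir, whose name varies, hence the wildcard (history and cookies are in files inside the profile itself; I'd leave those to the GUI) :

for /d %%p in ("%USERPROFILE%\Local Settings\Application Data\Mozilla\Firefox\Profiles\*") do rd /s /q "%%p\Cache"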

And finally we've still missed a ton of garbage in "Documents and Settings" but it's too much of a pain and too much of a risk to run around deleting more, so that's it. NOW we can defrag.


After all that I backed up my lappy. It took 6 bloody hours. I have a 60 GB disk in lappy and it can do nearly 60 MB/sec. That means it should take 1000 seconds. 17 minutes. To just copy the whole disk. Sigh.

I've tried a few backup programs back in the day, and they all have this severe order-of-magnitude slowness problem because they go through the damn file system, which is butt slow, rather than just mirroring the whole disk. Yes, I know there are programs that do a full disk mirror, but from what I can tell, none of them let you then treat the result as just a bunch of files you can browse into.

I never want a backup system that creates some custom file format on the backup server - I just want files. I never want to have to do a "recovery" - I just want to browse to the backups, find the old files I backed up, and copy them out manually.

What a backup program should do :

1. Read the whole disk as a raw image, so it's super fast.
2. Compress the image with a fast LZP or something in memory as it goes. (Compression here is not really to save space - it should actually make things faster, because compressing is faster than disk write speed.)
3. Write the compressed image to the server.
4. Provide an explorer plugin so that you can browse into the stored image as if it were just a dir and copy files out.

The whole thing should take maybe 30 minutes max.
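
You can actually fake steps 1-3 today with Win32 ports of the Unix tools. Just a sketch - it assumes a dd build that understands raw device names like \\.\PhysicalDrive0 and a gzip on the path, the server path is made up, and you still don't get step 4, the browsable image :

rem gzip -1 = fastest setting ; the point is to keep up with the disk, not max ratio
dd if=\\.\PhysicalDrive0 bs=1M | gzip -1 > \\server\backup\lappy.img.gz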

Oh, while I'm at it - HYACHACHCA - "robocopy" and "xcopy" both have huge terrible bugs. There are certain file names that they can enumerate in their FindFirst/FindNext but then fail to find when they actually try to copy the files. I've written before about the disaster of file system names on Windows, but this is ridiculous. Also, robocopy has this retry-wait thing, which is nice when the failure is a file being in use or something, but when it's a hard error like "file not found" it's dumb to spin in retry-wait for 30 minutes. And there's no key you can press to make it just give up and skip that file; you have to ctrl-C and start over, at which point you just give up and use your own copy program, because nobody else seems capable of writing the simplest piece of software.
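
For what it's worth, robocopy's /R:n and /W:n switches cap the retry count and the wait between retries - the defaults are a million retries at 30 seconds each, which is exactly where that spinning comes from. It doesn't fix the enumerate-then-can't-find bug, but at least it fails fast (paths here made up) :

robocopy C:\src \\server\backup\src /E /R:1 /W:5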

1 comment:

  1. I think Acronis TrueImage offers an explorer plugin that you can use to browse through their archive format. I'm not sure if it allows easy access to the files or if it's just as inconvenient as the .zip file plugin that is provided by Windows Explorer, though.
