how to correctly calculate the cluster size

Josiah Carlson jcarlson at uci.edu
Wed Apr 21 00:13:29 EDT 2004


> 	These are all FAT32 partitions. I don't know how, or whether
> it's even possible, to change cluster sizes on my XP laptop using NTFS.
> 
> 	While I used Partition Magic to do the partitioning, the sizes
> are the smallest cluster sizes possible for each partition's size. One
> reason I have so many partitions:

The Windows 2000 "Disk Administrator" software uses 4k cluster sizes by 
default.  I believe your varied cluster sizes are the result of using 
Partition Magic to create the partitions.
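
Incidentally, on the subject line's question: the smallest cluster size 
that can work on a FAT32 partition falls out of FAT32's 28-bit cluster 
numbers.  Here is a rough Python sketch of the calculation (the 
cluster-count limit is from Microsoft's FAT32 spec; the function and 
its 512-byte sector assumption are just my illustration, and real tools 
also weigh secondary costs, like the huge FAT tables that tiny clusters 
create):

# FAT32 cluster numbers are 28 bits, and Microsoft's spec reserves a
# few values, leaving at most 268435445 (0x0FFFFFF5) usable clusters
# per volume.
MAX_FAT32_CLUSTERS = 0x0FFFFFF5

def smallest_cluster_size(partition_bytes):
    """Smallest power-of-two cluster size (in bytes) that keeps the
    cluster count under the FAT32 limit.  Assumes 512-byte sectors,
    the smallest legal cluster size."""
    size = 512
    while partition_bytes // size > MAX_FAT32_CLUSTERS:
        size *= 2
    return size

for gb in (120, 200, 500):
    print('%3d GB -> %4d-byte clusters' % (gb, smallest_cluster_size(gb * 2**30)))
# 120 GB ->  512-byte clusters
# 200 GB -> 1024-byte clusters
# 500 GB -> 2048-byte clusters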


> 	It took forever to defrag a 300GB (two partitions) firewire
> drive under NTFS (this drive is used with my XP laptop for miniDV
> captures and editing). I actually had to double the memory of my laptop
> (now 768 MB) before the defrag could run to completion.

There exists an algorithm for defragmenting a drive that only needs to 
read and write the entire drive twice (I wrote one for a database 
defragmenter), and there likely exist algorithms that read and write 
even less.
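
To give a flavor of why two full passes can be enough, here is a toy 
Python model (hypothetical names; this is not the actual database 
defragmenter, just an illustration of the cycle-following idea):

# Toy model: the "disk" is a list of cells, each holding None (free)
# or a (file_id, block_no) tag for one block of a file.

def desired_layout(disk):
    """First pass (read-only): scan the disk once and decide where
    every block should end up -- files packed contiguously in order."""
    blocks = sorted(tag for tag in disk if tag is not None)
    return dict((tag, pos) for pos, tag in enumerate(blocks))

def defragment(disk):
    """Second pass: swap each block directly into its final position.
    Every swap permanently settles at least one block, so there are at
    most n swaps; at two reads and two writes per swap, total I/O is
    bounded by reading and writing the whole disk about twice."""
    target = desired_layout(disk)
    for pos in range(len(disk)):
        while disk[pos] is not None and target[disk[pos]] != pos:
            dest = target[disk[pos]]
            disk[pos], disk[dest] = disk[dest], disk[pos]
    return disk

disk = [('b', 0), None, ('a', 1), ('b', 1), None, ('a', 0), None, ('a', 2)]
print(defragment(disk))
# [('a', 0), ('a', 1), ('a', 2), ('b', 0), ('b', 1), None, None, None]

A real defragmenter would move block-sized buffers through RAM instead 
of swapping tuples, but the I/O accounting is the same.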

If you have a 150 gig drive (and it is full), your computer will need 
to read and write around 600 gigs in total (150 gigs, each read and 
written twice).  Even if your drive is fast, say 30 megs/second 
(probably on the high end for defragmenting), that's 600,000 MB / 30 
MB/s / 3600 s/hour ~ 5.6 hours.  In reality, you're probably getting 
closer to 5-15 megs/second during a defragment, which works out to 
11-33 hours to defrag each of your 150 gig partitions.
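
If you want to play with the numbers yourself, the arithmetic is a few 
lines of Python (the disk size and transfer rates are the guesses from 
above, not measurements):

def defrag_hours(disk_gb, mb_per_sec, passes=2):
    """Hours to read and write disk_gb gigabytes, `passes` times each."""
    total_mb = disk_gb * 1000 * passes * 2   # x2 for read + write
    return total_mb / float(mb_per_sec) / 3600

for rate in (30, 15, 5):
    print('%2d MB/s -> %5.1f hours' % (rate, defrag_hours(150, rate)))
# 30 MB/s ->   5.6 hours
# 15 MB/s ->  11.1 hours
#  5 MB/s ->  33.3 hours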

It's the whole capacity vs bandwidth issue on hard drives, which is 
similar to the bandwidth vs latency issue with RAM.  Ahh, technology.

  - Josiah


