[perl-python] a program to delete duplicate files

Christos TZOTZIOY Georgiou tzot at sil-tec.gr
Thu Mar 10 12:02:57 EST 2005


On Wed, 9 Mar 2005 16:13:20 -0600, rumours say that Terry Hancock
<hancock at anansispaceworks.com> might have written:

>For anyone interested in responding to the above, a starting
>place might be this maintenance script I wrote for my own use.  I don't
>think it exactly matches the spec, but it addresses the problem.  I wrote
>this to clean up a large tree of image files once.  The exact behavior
>described requires the '--exec="ls %s"' option as mentioned in the help.

The drawback of this method is that you have to read every file in full.  For example,
if you have ten files of less than 100 KiB each and one file of more than 2 GiB, there
is no need to read the 2 GiB file, is there?  Grouping files by st_size first lets you
skip any file whose size is unique, since it cannot possibly have a duplicate.

If it's a one-shot run, I guess it won't matter much.
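A minimal sketch of that size-first approach (the helper name, chunk size and the MD5
digest are my own choices here, not anything taken from Terry's script):

import os
import hashlib

def find_duplicates(paths):
    # Bucket paths by file size; a file whose size is unique
    # cannot have a duplicate, so it is never even opened.
    by_size = {}
    for p in paths:
        by_size.setdefault(os.stat(p).st_size, []).append(p)

    dups = []
    for group in by_size.values():
        if len(group) < 2:
            continue
        # Only now read contents, hashing in chunks so huge files
        # are never pulled into memory all at once.
        by_hash = {}
        for p in group:
            h = hashlib.md5()
            with open(p, 'rb') as f:
                for block in iter(lambda: f.read(1 << 20), b''):
                    h.update(block)
            by_hash.setdefault(h.hexdigest(), []).append(p)
        dups.extend(g for g in by_hash.values() if len(g) > 1)
    return dups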

On POSIX filesystems, one should also avoid comparing files that share the same
(st_dev, st_ino) pair, because those are hard links to the very same file.
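For instance, a small pre-filter like this (again only a sketch, the name is mine)
keeps one path per inode, so hard links never get compared against each other:

import os

def drop_hardlinks(paths):
    # Keep only the first path seen for each (st_dev, st_ino) pair;
    # any later path with the same pair is a hard link to that file.
    seen = set()
    kept = []
    for p in paths:
        st = os.stat(p)
        if (st.st_dev, st.st_ino) not in seen:
            seen.add((st.st_dev, st.st_ino))
            kept.append(p)
    return kept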
-- 
TZOTZIOY, I speak England very best.
"Be strict when sending and tolerant when receiving." (from RFC1958)
I really should keep that in mind when talking with people, actually...


