How To Do It Faster?!?

Simo Melenius firstname.lastname at iki.fi-spam
Sat Apr 2 01:35:02 EST 2005


andrea_gavana at tin.it writes:

> Every user of this big directory works on big studies regarding oil
> fields. Knowing the amount of data (and number of files) we have to
> deal with (produced by simulators, visualization tools, and so on),
> and knowing that users are usually lazy about cleaning up
> unused/old files, this is a way for one of us to quickly scan all
> the directories and identify which files belong to him. Having them
> in an organized, size-sorted wxPython list, the user can decide
> whether he wants to delete some files (which he has almost surely
> forgotten even exist...) or not. It is as easy as a button click
> (retrieve the data-->delete the files).

Correct me if I'm wrong, but it _seems_ the listing doesn't need to be
up-to-date to the minute or hour, since the users will primarily be
looking for old/unused files. So why not have a daily cronjob on the
Unix server produce an appropriate file list in, e.g., the root
directory of your file server?

Your Python client would then load that (possibly compressed) text
file from the network share and find the needed bits in there.
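A minimal loading sketch, for what it's worth (the share path is only
a placeholder, and whether the listing is gzipped depends on your
setup):

    import gzip

    # Placeholder path to the listing on the network share; adjust to
    # wherever the cronjob writes it.
    LISTING = "/mnt/yourserverroot/files.txt.gz"

    def open_listing(path):
        # Open the listing transparently, gzipped or not.
        if path.endswith(".gz"):
            return gzip.open(path, "rt", errors="replace")
        return open(path, "r", errors="replace")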

Note that if some "old/unneeded" files are missing from today's
listing, they'll show up in the following day's.

For example, running the GNU find command like this:

$ find . -type f -printf "%T@ %u %s %p\n" > /yourserverroot/files.txt

produces a file where each line contains the last modified time,
username, size and path for one file. Dead easy to parse with Python,
and you'll only have to set up the cronjob _once_ on the Unix server.
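As a rough parsing sketch (the filename and the username "andrea" are
only placeholders; the path field can contain spaces, so split at
most three times):

    # Read the listing, keep one user's files, sort by size.
    entries = []
    with open("files.txt", "r", errors="replace") as listing:
        for line in listing:
            parts = line.rstrip("\n").split(" ", 3)
            if len(parts) != 4:
                continue                 # skip malformed lines
            mtime, user, size, path = parts
            entries.append((float(mtime), user, int(size), path))

    mine = [e for e in entries if e[1] == "andrea"]
    mine.sort(key=lambda e: e[2], reverse=True)   # biggest files first
    for mtime, user, size, path in mine[:20]:
        print("%12d  %s" % (size, path))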

(If the file becomes too big, grep can additionally be used to split
it, e.g. into one file per user.)
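
If you'd rather stay in Python than shell out to grep, something
roughly like this could write one listing per user (the input and
output file names are again placeholders):

    import collections

    # One output file per user, split on the second field of each line.
    per_user = collections.defaultdict(list)
    with open("files.txt", "r", errors="replace") as listing:
        for line in listing:
            parts = line.split(" ", 3)
            if len(parts) == 4:
                per_user[parts[1]].append(line)

    for user, lines in per_user.items():
        with open("files_%s.txt" % user, "w") as out:
            out.writelines(lines)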


br,
S

-- 
firstname.lastname at iki.fi-spam


