How To Do It Faster?!?

andrea_gavana at tin.it
Sat Apr 2 07:28:13 EST 2005


Hello Simo & NG,

>Correct me if I'm wrong but since it _seems_ that the listing doesn't
>need to be up-to-date each minute/hour as the users will be looking
>primarily for old/unused files, why not have a daily cronjob on the
>Unix server to produce an appropriate file list on e.g. the root
>directory of your file server?

You are correct. I don't need this list to be updated every minute/hour.

>$ find . -type f -printf "%T@ %u %s %p\n" > /yourserverroot/files.txt

That is a nice idea. I don't know very much about Unix, but I suppose that
from a ksh I can run this command (or a similar one) to obtain the list I
need. If anyone knows whether that command will also run under a plain ksh,
could you please confirm it?

Moreover, I could wrap that command in a while loop, something like:

while true
do

    if [ -e /yourserverroot/filesbackup.txt ]; then
        # the backup already exists: rebuild the list, then refresh the backup
        find . -type f -printf "%T@ %u %s %p\n" > /yourserverroot/files.txt
        cp /yourserverroot/files.txt /yourserverroot/filesbackup.txt
    else
        # first run: create the backup list directly
        find . -type f -printf "%T@ %u %s %p\n" > /yourserverroot/filesbackup.txt
    fi

done

or something similar (I don't have Unix at hand right now, so I cannot test
the commands and, as I said, I don't know Unix very well...). In this way I
would always have filesbackup.txt up to date, limited only by how long "find"
takes to run on the server.
Then my GUI could scan the filesbackup.txt file and search for the information
about a particular user.
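
For the GUI part, I imagine something like the rough Python sketch below
(untested; the listing file name and the user name are only placeholders). It
assumes every line of filesbackup.txt has the four fields written by the
quoted find command (epoch mtime, owner, size in bytes, path) and keeps only
the lines belonging to one user.

import time

def files_for_user(listing_path, wanted_user):
    """Return (mtime, size, path) tuples for the files owned by wanted_user.

    Each line is expected to look like the output of
        find . -type f -printf "%T@ %u %s %p\n"
    i.e.: <mtime as epoch seconds> <owner> <size in bytes> <path>
    """
    results = []
    for line in open(listing_path):
        line = line.rstrip("\n")
        if not line:
            continue
        # Split into at most 4 fields, so paths containing spaces stay whole.
        parts = line.split(" ", 3)
        if len(parts) != 4:
            continue  # skip malformed lines
        mtime, owner, size, path = parts
        if owner == wanted_user:
            results.append((float(mtime), int(size), path))
    return results

if __name__ == "__main__":
    # "filesbackup.txt" and "andrea" are placeholders for the real values.
    for mtime, size, path in files_for_user("filesbackup.txt", "andrea"):
        print("%s %12s %s" % (time.ctime(mtime), size, path))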

Thanks to all the NG for your suggestions!

Andrea.



