Handling signals and performance issues.
sragsdale at my-deja.com
Wed Dec 13 13:20:46 EST 2000
In article <meRU5.211$Mx.12164 at sjc-read.news.verio.net>,
rayvd at nospam.firetail.org (Ray Van Dolson) wrote:
> I've written a small Python script that watches a logfile (kinda like
> tail -f) and splits it into certain other files in real time. I'm
> running into two issues however that I haven't quite resolved yet to
> get this script running as I'd like it. Firstly, the program uses up
> a lot of CPU time. This is almost certainly because of the way I'm
> watching the logfiles for changes. Basically it's like this:
>
> while 1:
>     ln = inputLog.readline()
>     if ln:
>         processLine(ln)
>     else:
>         pass
>
> Much of the time of course there have been no changes to the logfile
> and so it keeps looping, passing and waiting for a change. While the
> system actually remains fast, watching 'top' shows that python is
> using up all the cpu when it can. Is there a better way to watch a
> file for changes that doesn't use such a cpu-consuming while loop?
Instead of looking into 'tail -f' to see how it does it, why not just
use it yourself? The following program watches a logfile and prints out
the mirror image of the file, and it uses very minimal resources.
Useless, but it'll give you a starting point.
########################################
import string, os

file = "logfile.txt"
# Let tail do the waiting for us; it blocks until new data arrives,
# so our readline() blocks too and we burn no CPU in between.
fd = os.popen('tail -f ' + file)

line = fd.readline()
while line:
    # Reverse the line (minus its trailing newline) and print it.
    line = list(line[:-1])
    line.reverse()
    line = string.join(line, '')
    print line
    line = fd.readline()
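If you'd rather not depend on an external tail process, you can also just sleep whenever readline() comes back empty, which keeps the CPU usage down too. A minimal sketch (the follow() name and its interval/max_idle parameters are my own invention; processLine is your function from the original post):

```python
import time

def follow(f, process, interval=1.0, max_idle=None):
    # Poll f for new lines, calling process(line) on each one.
    # Sleep between empty reads instead of spinning.  max_idle
    # (consecutive empty polls before giving up) exists only so the
    # loop can terminate; a real log watcher would leave it as None.
    idle = 0
    while 1:
        ln = f.readline()
        if ln:
            idle = 0
            process(ln)
        else:
            if max_idle is not None:
                idle = idle + 1
                if idle >= max_idle:
                    break
            time.sleep(interval)
```

You'd then call it as follow(inputLog, processLine) and the loop spends almost all of its time asleep rather than busy-waiting.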