get the size of a dynamically changing file fast?
Jason
tenax.raccoon at gmail.com
Wed Jan 23 12:24:49 EST 2008
On Jan 22, 3:22 pm, Stef Mientki <stef.mien... at gmail.com> wrote:
> Mike Driscoll wrote:
> > On Jan 22, 3:35 pm, Stef Mientki <stef.mien... at gmail.com> wrote:
>
> >> Mike Driscoll wrote:
>
> >>> On Jan 17, 3:56 pm, Stef Mientki <stef.mien... at gmail.com> wrote:
>
> >>>> hello,
>
> >>>> I've a program (not written in Python) that generates a few thousand
> >>>> bytes per second;
> >>>> these are dumped into 2 buffer files at an interval of 50 msec,
> >>>> and the files can be read by another program for further processing.
>
> >>>> A program written in VB or delphi can handle the data in the 2 buffers
> >>>> perfectly.
> >>>> Sometimes Python is also able to process the data correctly,
> >>>> but often it can't :-(
>
> >>>> I keep one of the files open and test the size of the open datafile every
> >>>> 50 msec.
> >>>> I have tried
> >>>> os.stat ( ....) [ ST_SIZE]
> >>>> os.path.getsize ( ... )
> >>>> but they both show the same behaviour: sometimes it works, and the data
> >>>> is collected every 50 .. 100 msec;
> >>>> sometimes 1 .. 1.5 seconds is needed to detect a change in filesize.
>
> >>>> I'm using python 2.4 on winXP.
>
> >>>> Is there a solution for this problem ?
>
> >>>> thanks,
> >>>> Stef Mientki
>
> >>> Tim Golden has a method to watch for changes in a directory on his
> >>> website:
>
> >>>http://tgolden.sc.sabren.com/python/win32_how_do_i/watch_directory_fo...
>
> >>> This old post also mentions something similar:
>
> >>>http://mail.python.org/pipermail/python-list/2007-October/463065.html
>
> >>> And here's a cookbook recipe that claims to do it as well using
> >>> decorators:
>
> >>>http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/426620
>
> >>> Hopefully that will get you going.
>
> >>> Mike
>
> >> thanks Mike,
> >> sorry for the late reaction.
> >> I've got it working perfectly now.
> >> After all, os.stat works perfectly well;
> >> the problem was in the program that generated the file with increasing
> >> size:
> >> by truncating it after each block write, it apparently guarantees that
> >> the file is flushed to disk, and all problems are solved.
>
> >> cheers,
> >> Stef Mientki
>
> > I almost asked if you were making sure you had flushed the data to the
> > file...oh well.
>
> Yes, that's a small disadvantage of using a "high-level" language,
> where there's no flush available, and you assume it'll be done
> automatically ;-)
>
> cheers,
> Stef
Uhm, there is a flush method for Python's files. From
"http://docs.python.org/lib/bltin-file-objects.html":
flush()
Flush the internal buffer, like stdio's fflush(). This may
be a no-op on some file-like objects.
As for an example:
>>> import os
>>> f = open('vikings.txt', 'wb')
>>> os.stat('vikings.txt').st_size
0L
>>> f.write('Spam, spam, spam, spam! ' * 1000) # Bloody vikings...
>>> os.stat('vikings.txt').st_size
24576L
>>> f.flush()
>>> os.stat('vikings.txt').st_size
25000L
>>>
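For completeness, here's a minimal sketch of the writer/reader pattern this thread converges on (the file name is made up for illustration; the session above is Python 2, but this sketch uses modern bytes literals). flush() only empties Python's internal buffer into the OS; os.fsync() additionally asks the OS to commit the data, which may be what Stef's truncate-after-write trick achieved as a side effect:

```python
import os
import tempfile

# Hypothetical file name; the original setup had two buffer files
# written by a non-Python program.
path = os.path.join(tempfile.gettempdir(), 'buffer.dat')

# Writer side: flush after each block so a reader polling os.stat()
# sees the new size right away.
with open(path, 'wb') as f:
    f.write(b'x' * 1000)      # one 1000-byte block
    f.flush()                 # empty Python's internal buffer
    os.fsync(f.fileno())      # ask the OS to write it to disk

# Reader side: poll the size, as the original poster does every 50 msec.
size = os.path.getsize(path)
print(size)                   # 1000
```

In a real reader you'd wrap the os.path.getsize() call in a loop with a time.sleep(0.05) between polls and compare against the previously seen size.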
Is there something that I'm missing here?
--Jason