Possible File iteration bug

Billy Mays noway at nohow.com
Fri Jul 15 08:52:37 EDT 2011


On 07/15/2011 08:39 AM, Thomas Rachel wrote:
> Am 14.07.2011 21:46 schrieb Billy Mays:
>> I noticed that if a file is being continuously written to, the file
>> generator does not notice it:
>
> Yes. That's why there were alternative suggestions in your last thread
> "How to write a file generator".
>
> To repeat mine: an object which is not an iterator, but an iterable.
>
> class Follower(object):
>     def __init__(self, file):
>         self.file = file
>     def __iter__(self):
>         while True:
>             l = self.file.readline()
>             if not l: return
>             yield l
>
> if __name__ == '__main__':
>     import time
>     f = Follower(open("/var/log/messages"))
>     while True:
>         for i in f: print i,
>         print "all read, waiting..."
>         time.sleep(4)
>
> Here, you iterate over the object until it is exhausted, but you can
> iterate again to get the next entries.
>
> The difference to the file as iterator is, as you have noticed, that
> once an iterator is exhausted, it will be so forever.
>
> But if you have an iterable, like the Follower above, you can reuse it
> as you want.
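For what it's worth, here is a sketch of the same iterable pattern in modern Python, with a StringIO standing in for the growing log file so the two-pass behaviour can be seen without /var/log/messages (the tell/seek shuffle is only needed because StringIO shares one position between the "writer" and the reader):

```python
import io

class Follower(object):
    """An iterable (not an iterator): every call to __iter__ starts a
    fresh generator, so the same object can be iterated again later to
    pick up lines appended in the meantime."""
    def __init__(self, file):
        self.file = file
    def __iter__(self):
        while True:
            line = self.file.readline()
            if not line:
                return
            yield line

# Simulate a growing file with StringIO (a real log would use open()).
buf = io.StringIO("one\ntwo\n")
f = Follower(buf)

first_pass = list(f)    # ['one\n', 'two\n'] -- everything available now

pos = buf.tell()        # remember where the reader stopped
buf.write("three\n")    # more data "arrives"
buf.seek(pos)           # restore the read position

second_pass = list(f)   # ['three\n'] -- same object, a fresh pass
```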


I did see it, but it feels less Pythonic than using a generator.  I did 
end up using an extra class to get more data from the file, but it seems 
like unnecessary overhead.  Also, the Python docs for file.next() mention 
a performance gain from using the file iterator over the readline 
function.
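The readline-based approach does at least keep working across end-of-file.  A minimal sketch of that trade-off (the poll interval and names are only illustrative):

```python
import io
import time

def follow(fobj, poll=0.5):
    """Yield lines as they appear; at EOF, sleep and retry rather than
    stopping, so the generator is never exhausted."""
    while True:
        line = fobj.readline()
        if line:
            yield line
        else:
            time.sleep(poll)

buf = io.StringIO("a\nb\n")
g = follow(buf, poll=0)
first = next(g)    # 'a\n'
second = next(g)   # 'b\n'

pos = buf.tell()   # append more data behind the reader's back
buf.write("c\n")
buf.seek(pos)
third = next(g)    # 'c\n' -- the same generator picks up later data
```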

What would really be useful is some sort of PauseIteration exception 
which doesn't close the generator when raised, but instead signals to 
the looping construct that there is no more data for now.

--
Bill





More information about the Python-list mailing list