[Python-ideas] `__iter__` for queues?

MRAB python at mrabarnett.plus.com
Wed Jan 20 04:47:15 CET 2010


Raymond Hettinger wrote:
> On Jan 19, 2010, at 6:06 PM, Terry Reedy wrote:
> 
>> On 1/19/2010 4:20 PM, cool-RR wrote:
>>> For me, iterating on the queue means just calling `get`
>>> repeatedly until it's empty.
> 
> If that is all you're looking for, then just do that.  Write a short
> generator to dump the queue.  You can do that without blocking your
> producer thread.
> 
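Something like this, presumably (untested); get_nowait() raises Queue.Empty
as soon as the queue is momentarily empty, so the producer never waits on
the consumer:

    import Queue

    def drain(q):
        # Dump whatever happens to be in the queue right now, without blocking.
        while True:
            try:
                yield q.get_nowait()
            except Queue.Empty:
                return
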
> The problem with proposing a change to the API for Queue.Queue and
> multiprocessing.Queue is that it presents non-obvious pitfalls for
> other common use cases.  It is better to leave it out of the API and
> let users explicitly craft solutions that fit their problem (perhaps
> using two queues:  Producer --->  DatabaseUpdater --> Consumer or
> some solution involving locking).  For many uses of Queue, it doesn't
> make sense to block a producer for the time it takes to iterate, nor
> does it make sense to have a non-blocking iterator (leading to a race
> condition and inconsistent results).
> 
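A two-queue pipeline along those lines might look something like this
(untested sketch; the names and the trivial "updater" stage are just
placeholders for the real work):

    import Queue
    import threading

    done = object()    # end-of-stream marker

    def producer(out_q):
        for i in range(10):
            out_q.put(i)
        out_q.put(done)

    def database_updater(in_q, out_q):
        # Sits between the two queues and forwards (or transforms) items.
        while True:
            item = in_q.get()
            out_q.put(item)
            if item is done:
                break

    def consumer(in_q):
        while True:
            item = in_q.get()
            if item is done:
                break
            print item

    q1, q2 = Queue.Queue(), Queue.Queue()
    threading.Thread(target=producer, args=(q1,)).start()
    threading.Thread(target=database_updater, args=(q1, q2)).start()
    consumer(q2)
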
> It is better to use the API as designed than to propose a new method
> that may be unsuitable for many use cases.
> 
> 
>> So iterating with an iterator is not quite the same as repeated
>> one-item fetches.
>> 
> 
> 
> Right.  The two ideas should not be conflated.  A Queue object is all
> about buffering producers and consumers running in different threads.
> Any API that is at odds with that notion is probably not a good
> idea (i.e. iterating over a dynamically updating queue buffer).
> 
I was thinking of methods like these, for example (sketched on a Queue
subclass here, though they could just as well live on Queue itself):

from Queue import Queue

sentinel = object()    # unique end-of-stream marker

class IterableQueue(Queue):
    def close(self):
        # The producer calls this once it has put its last item.
        self.put(sentinel)

    def __iter__(self):
        # The consumer just writes a for-loop; get() blocks as usual.
        while True:
            item = self.get()
            if item is sentinel:
                break
            yield item

How would they not be thread-safe, or how would they block a producer?
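
The way I'd expect them to be used is simply this (untested):

    import threading

    q = IterableQueue()                  # the subclass sketched above

    def produce():
        for i in range(10):
            q.put(i)
        q.close()                        # nothing more is coming

    threading.Thread(target=produce).start()

    for item in q:                       # plain iteration; get() blocks as usual
        print item

close() is just a put(), so the producer blocks no more than it already does
with a plain Queue, and the consumer blocks in get() exactly as it would when
calling get() in a loop by hand.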



