Queues - Is Infinity all right?

Anand Pillai pythonguy at Hotpop.com
Mon Oct 6 19:48:26 EDT 2003


This was exactly what my problem was. The producer thread was
pushing data into the queue faster than the consumer could
keep up with.

Since I am using Python threads, what happens most of the time is
that the queue fills to a huge size before a context switch to the
consumer thread takes place, causing memory problems.
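For what it's worth, a bounded queue gives you back-pressure for free: put() blocks once maxsize items are waiting, so the producer can never run ahead of the consumer by more than the bound. A minimal sketch (written against the Python 3 `queue` module; the `Queue.Queue` class of the day takes the same `maxsize` argument):

```python
import queue
import threading

q = queue.Queue(maxsize=10)  # bounded: put() blocks once 10 items are waiting

def producer():
    for i in range(100):
        q.put(i)  # blocks here whenever the consumer falls behind

consumed = []

def consumer():
    for _ in range(100):
        consumed.append(q.get())
        q.task_done()

t_prod = threading.Thread(target=producer)
t_cons = threading.Thread(target=consumer)
t_prod.start(); t_cons.start()
t_prod.join(); t_cons.join()
print(len(consumed))  # 100 items consumed; the queue never held more than 10
```

The producer finishes all 100 puts, but memory use is capped at 10 queued items no matter how rarely the consumer gets scheduled.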

I have not gone through the code of Queue.py, but I suspect that
the queue.get() method is an O(n) operation, which matters given
the ratio of 'put's to 'get's.

A related question here is: how do you avoid the producer threads
deadlocking when the queue is full?

If the question is not clear: my queue is of size 'n'. It is already
full with 'n' pieces of data. All of my 'n' producer threads are
trying to push data into it at the *same* time, and there is no
consumer thread popping the data off (in my program the consumers
are the producers themselves, a bad design, I agree...).
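One way out, assuming the standard Queue API: a producer that must not block can use put_nowait() (or put() with a timeout) and handle the Full exception instead of waiting forever:

```python
import queue

q = queue.Queue(maxsize=2)
q.put(1); q.put(2)  # queue is now full

# A producer that cannot afford to block tries a non-blocking put
# instead of q.put(item), which would wait here indefinitely.
try:
    q.put_nowait(3)
    stored = True
except queue.Full:
    stored = False  # drop the item, retry later, or consume one first

print(stored)  # False: the put was refused instead of deadlocking
```

With put(item, timeout=5) instead of put_nowait(item), the thread waits up to five seconds for space before raising Full, which turns a permanent deadlock into a recoverable error.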

This is clearly a deadlock situation, since no thread will ever get
to put to the queue. I solved it by creating a new consumer thread
to pop off some data whenever deadlock occurs. But the deadlock
sometimes occurs so often that many extra threads get created this way.

The other approach is to separate out the consumer and producer thread
roles so that there is always one consumer thread somewhere which is
popping data off the queue, which is what I have done now.
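That separation can be sketched as a single long-lived consumer thread plus a shutdown sentinel (a common pattern, not the only one; the None sentinel and the doubling "work" are just illustrative):

```python
import queue
import threading

q = queue.Queue(maxsize=5)
results = []

def consumer():
    # Dedicated consumer: runs for the life of the program, so producers
    # blocked on a full queue are always eventually unblocked.
    while True:
        item = q.get()
        if item is None:       # sentinel value tells the consumer to stop
            q.task_done()
            break
        results.append(item * 2)
        q.task_done()

threading.Thread(target=consumer, daemon=True).start()

for i in range(20):            # many producer threads could do this too
    q.put(i)                   # blocks briefly when the queue is full
q.put(None)                    # shut the consumer down
q.join()                       # wait until every item has been processed
print(len(results))
```

Because there is always a consumer draining the queue, the full-queue deadlock above cannot occur: a blocked put() is guaranteed to be unblocked eventually.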

Anyway, how do you make sure that deadlock situations do not occur?
Can you write some Event()-based mechanism to make sure that the
threads always access the queue at different times? Since Python offers
no control over thread priority and no mechanism to suspend a thread, how
else could this be done?

That brings me to a favorite topic: implementing a thread wrapper
over the existing Python threading API to allow suspending threads
and modifying thread priority. Has this been considered as a PEP anytime
before?
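A cooperative version of such a wrapper can at least be sketched with threading.Event alone: the thread checks the event between units of work, so clear() pauses it and set() resumes it. This is cooperative suspension, not true OS-level preemptive suspension, and the PausableWorker name is just illustrative:

```python
import threading
import time

class PausableWorker(threading.Thread):
    """Cooperative suspend/resume built on threading.Event.

    The worker blocks on the event between units of work; clear()
    pauses it, set() resumes it. It cannot be stopped mid-unit.
    """
    def __init__(self, work_items):
        super().__init__(daemon=True)
        self._running = threading.Event()
        self._running.set()          # start in the unpaused state
        self.work_items = work_items
        self.done = []

    def pause(self):
        self._running.clear()

    def resume(self):
        self._running.set()

    def run(self):
        for item in self.work_items:
            self._running.wait()     # blocks here while paused
            self.done.append(item)

w = PausableWorker(list(range(5)))
w.pause()                    # paused before it starts
w.start()
time.sleep(0.1)
paused_count = len(w.done)   # 0: the worker never got past wait()
w.resume()
w.join(timeout=2)
print(paused_count, w.done)
```

The limitation is exactly the one the post complains about: the thread is only suspendable at points where it voluntarily checks the event, which is as close as pure-Python threading gets to suspension.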

Thank you all

-Anand


ykingma at accessforall.nl wrote in message news:<3f818f20$0$14130$e4fe514c at dreader4.news.xs4all.nl>...
> Peter Hansen wrote:
> 
> 
> ...
> > 
> > That said, I'd consider using a fixed queue only when I was greatly
> > concerned about whether my consumer thread was going to be able to
> > keep up with my producer thread(s), and was therefore concerned about
> > overflowing memory with unconsumed input.
> > 
> > In other words, I'd worry more about the robustness of the app
> > than about optimizing it before it worked...
> > 
> 
> I've been bitten a few times by unbounded queues when the consumer
> thread could not keep up: the queue grows to memory limits and
> no robustness is left.
> 
> So I changed my default choice for queue implementations to bounded
> queues. They are much more predictable under unexpected loads
> as they eventually force the producer to wait for the consumer.
> 
> The next question is of course what maximum queue size to use.
> You can get good answers from queuing theory, but 10 to 20
> works well in many cases.
> 
> Kind regards,
> Ype



