multithreading app memory consumption

Roman Petrichev tier at inbox.ru
Sun Oct 22 19:31:28 EDT 2006


Hi folks.
I've just run into a very nasty memory consumption problem.
I have a multithreaded app with 150 threads that all run one and the
same function: via urllib2 it fetches a web page's HTML and assigns it
to a local variable. On the next iteration the variable is overwritten
with the next page's HTML. At any given moment the combined size of the
variables holding page data is no more than 15MB (I came up with a
tricky way to measure this; see the rough sketch after the test code).
Yet within the first 30 minutes all of the system memory (512MB) is
consumed and MemoryError starts being raised.
Why does this happen, and how can I keep memory consumption within a
limit of, say, 400MB? Could there be a memory leak somewhere?
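The closest thing I can think of to a hard cap is resource.setrlimit on
Unix - a rough, untested sketch of what I mean (400*1024*1024 is just
the 400MB figure from above):

import resource

# Cap this process's total address space at ~400MB (Unix/Linux only;
# RLIMIT_AS is not available on every platform). Once the cap is hit,
# allocations fail and Python raises MemoryError instead of the whole
# box being driven into swap.
LIMIT = 400 * 1024 * 1024
soft, hard = resource.getrlimit(resource.RLIMIT_AS)
resource.setrlimit(resource.RLIMIT_AS, (LIMIT, hard))

Is that a sane approach, or is there a better way?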
Thnx

The test app code:


import sys
import traceback
import threading
import Queue
import urllib2

Q = Queue.Queue()
for i in rez:  # rez is the list of URLs to fetch; its length is 5000
    Q.put(i)


def checker():
    while True:
        try:
            # get_nowait() so that Queue.Empty is actually raised once
            # the queue is drained; a plain get() would block forever
            url = Q.get_nowait()
        except Queue.Empty:
            break
        try:
            opener = urllib2.urlopen(url)
            data = opener.read()
            opener.close()
        except:
            # log the failure and move on to the next URL
            sys.stderr.write('ERROR: %s\n' % traceback.format_exc())
            try:
                opener.close()
            except:
                pass
            continue
        print len(data)


for i in xrange(150):
    new_thread = threading.Thread(target=checker)
    new_thread.start()
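
The 'tricky way' I mentioned for measuring how much page data the
threads hold is roughly bookkeeping like the following (PAGE_SIZES,
remember_size and total_held are just illustrative names, not part of
the test app; checker() would call remember_size(len(data)) right after
the read):

import threading

# Each worker records the size of the page it currently holds;
# total_held() sums those sizes across all workers. In my runs that
# sum never goes above ~15MB.
PAGE_SIZES = {}
SIZES_LOCK = threading.Lock()

def remember_size(nbytes):
    SIZES_LOCK.acquire()
    try:
        PAGE_SIZES[threading.currentThread().getName()] = nbytes
    finally:
        SIZES_LOCK.release()

def total_held():
    SIZES_LOCK.acquire()
    try:
        return sum(PAGE_SIZES.values())
    finally:
        SIZES_LOCK.release()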


