Speeding up network access: threading?

Jens Müller me4 at privacy.net
Mon Jan 4 11:22:42 EST 2010


Hello,

what would be best practice for speeding up a large number of HTTP GET 
requests made via urllib? So far they are issued sequentially, each request 
taking up to one second. The results must be merged into a list; the 
original order need not be preserved.
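
For reference, the current code is essentially this (simplified, Python 2.x, 
with made-up placeholder URLs):

import urllib

urls = ["http://example.com/page%d" % i for i in range(50)]  # placeholders

# current approach: one request after another, each up to ~1 s
results = []
for url in urls:
    results.append(urllib.urlopen(url).read())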

I think speed could be improved by parallelizing, e.g. with multiple 
threads.
Are there any Python best practices, or even existing modules, for creating 
and handling a task queue with a fixed number of concurrent threads?
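
To make the question concrete, here is roughly what I imagine: a fixed pool 
of worker threads pulling URLs off a Queue and appending the responses to a 
shared list (a minimal sketch, Python 2.x, placeholder URLs again):

import threading
import urllib
import Queue

NUM_WORKERS = 10  # fixed size of the thread pool

def worker(tasks, results, lock):
    # Pull URLs off the queue until it is drained, then exit.
    while True:
        try:
            url = tasks.get_nowait()
        except Queue.Empty:
            return
        data = urllib.urlopen(url).read()
        with lock:
            results.append(data)  # result order is arbitrary

urls = ["http://example.com/page%d" % i for i in range(50)]  # placeholders

tasks = Queue.Queue()
for url in urls:
    tasks.put(url)  # fill the queue before the workers start

results = []
lock = threading.Lock()
threads = [threading.Thread(target=worker, args=(tasks, results, lock))
           for _ in range(NUM_WORKERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# 'results' now holds all responses, merged into one list

(The lock may be unnecessary, since list.append is atomic under the GIL, 
but it makes the intent explicit.)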

Thanks and regards! 



