Speeding up network access: threading?

Terry Reedy tjreedy at udel.edu
Mon Jan 4 12:17:46 EST 2010


On 1/4/2010 11:22 AM, Jens Müller wrote:
> Hello,
>
> what would be best practice for speeding up a larger number of HTTP GET
> requests done via urllib? Until now they are made in sequence, each
> request taking up to one second. The results must be merged into a list,
> while the original sequence need not be kept.
>
> I think speed could be improved by parallelizing. One could use multiple
> threads.
> Are there any Python best practices, or even existing modules, for
> creating and handling a task queue with a fixed number of concurrent
> threads?

I believe code of this type has been published here in various threads. 
The fairly obvious thing to do is use a queue.Queue for tasks and 
another for results, plus a pool of threads that read, fetch, and write.
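As a minimal sketch of that pattern (assuming Python 3's queue and
urllib.request; URLS and NUM_WORKERS are placeholders, not values from
the original post):

import queue
import threading
import urllib.request

# Placeholder values -- adjust for the real workload.
URLS = ["http://example.com/a", "http://example.com/b"]
NUM_WORKERS = 8

tasks = queue.Queue()
results = queue.Queue()

def worker():
    # Each worker loops: read a url from the task queue, fetch it,
    # and write the result (or the exception) to the result queue.
    while True:
        url = tasks.get()
        if url is None:         # sentinel: no more work for this thread
            break
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                results.put((url, resp.read()))
        except OSError as exc:  # urllib.error.URLError is an OSError
            results.put((url, exc))

threads = [threading.Thread(target=worker) for _ in range(NUM_WORKERS)]
for t in threads:
    t.start()

for url in URLS:
    tasks.put(url)
for _ in threads:
    tasks.put(None)             # one sentinel per worker

for t in threads:
    t.join()

# Every url yields exactly one (url, data-or-exception) pair,
# in completion order rather than request order.
merged = [results.get() for _ in URLS]

The None sentinels let each worker exit cleanly once the task queue is
drained; since every URL produces exactly one result, pulling len(URLS)
items from the result queue gives the merged list the poster asked for.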




