multiprocessing eats memory

redbaron ivanov.maxim at gmail.com
Sat Sep 27 18:29:45 EDT 2008


> When processing data in parallel you will use up as much memory as
> however many datasets you are processing at any given time.
Worker processes eat 2-4 times more memory than the data I pass to them.


> If you need to reduce memory use then you need to start fewer
> processes and use some mechanism to distribute the work on them as
> they become free. (see recommendation that uses Queues)
I don't understand how I could use a Queue here. When a worker process
finishes computing, it puts its id into the Queue, and in the main
process I retrieve that id — but how do I then retrieve the result from
the worker process?
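(One answer to the question above: instead of putting only an id into the
Queue, the worker can put a `(task_id, result)` tuple onto a separate result
Queue, so the main process gets the answer and the id together. Below is a
minimal sketch of that pattern with a fixed pool of two workers; the `worker`
and `run` names and the `sum()` computation are placeholders, not anything
from the original thread.)

```python
from multiprocessing import Process, Queue

def worker(task_q, result_q):
    # Pull (task_id, data) pairs until the None sentinel arrives,
    # and push (task_id, result) back to the main process.
    for task_id, data in iter(task_q.get, None):
        result_q.put((task_id, sum(data)))  # sum() stands in for the real work

def run(tasks, n_workers=2):
    # With a fixed number of workers, only a few datasets are in
    # flight at once, which caps memory use.
    task_q, result_q = Queue(), Queue()
    workers = [Process(target=worker, args=(task_q, result_q))
               for _ in range(n_workers)]
    for w in workers:
        w.start()
    for item in tasks.items():
        task_q.put(item)
    for _ in workers:
        task_q.put(None)  # one sentinel per worker so each one exits
    # Collect exactly one result per task; get() blocks until each arrives.
    results = dict(result_q.get() for _ in tasks)
    for w in workers:
        w.join()
    return results

if __name__ == "__main__":
    print(run({0: [1, 2, 3], 1: [10, 20], 2: [5]}))
```

Because results carry their own task id, they can arrive in any order and
still be matched up in the main process.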




More information about the Python-list mailing list