Multiple independently started python processes and sharing of a module

Aahz aahz at pythoncraft.com
Sun Feb 6 09:53:20 EST 2011


In article <igo5a0$r48$1 at news.eternal-september.org>,
Martin P. Hellwig <martin.hellwig at dcuktec.org> wrote:
>
>Currently my solution is to wrap the module in another module that, when
>used, creates a directory and pipes to the process
>(multiprocessing.Connection), thus enforcing single access, and within
>that I have wrapped the db functions again so that a select statement as
>mentioned above is actually an execute followed by a fetchall.
>
>I still have the nagging feeling that I have reinvented a square wheel
>or am totally missing the point.
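
If I'm reading that right, the shape of it is roughly the sketch below:
one broker process owns the database connection, every other process
talks to it over a pipe, and a "select" is really an execute followed by
a fetchall.  All the names here are mine, not your code:

    import multiprocessing
    import sqlite3

    def _db_broker(conn, db_path):
        # Single process that owns the sqlite connection; everything
        # else talks to it over the pipe, so access is serialized.
        db = sqlite3.connect(db_path)
        while True:
            msg = conn.recv()
            if msg is None:                 # shutdown sentinel
                break
            sql, params = msg
            cur = db.execute(sql, params)
            conn.send(cur.fetchall())       # "select" = execute + fetchall
        db.close()

    def start_broker(db_path):
        parent, child = multiprocessing.Pipe()
        proc = multiprocessing.Process(target=_db_broker,
                                       args=(child, db_path))
        proc.start()
        return parent, proc

    if __name__ == '__main__':
        pipe, proc = start_broker(':memory:')
        pipe.send(('CREATE TABLE t (x INTEGER)', ()))
        pipe.recv()
        pipe.send(('SELECT x FROM t', ()))
        print(pipe.recv())
        pipe.send(None)
        proc.join()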

What you want is a worker pool or connection pool, i.e. you keep several
open connections around and assign one to each request.
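
A minimal queue-based pool looks something like the sketch below (the
class and file names are mine, not any particular library's).  Within
one process, each thread checks a connection out per request and puts
it back when done; across processes you would sit the pool behind a
small server, much like your broker:

    import contextlib
    import queue
    import sqlite3

    class ConnectionPool:
        def __init__(self, db_path, size=5):
            self._pool = queue.Queue(maxsize=size)
            for _ in range(size):
                self._pool.put(
                    sqlite3.connect(db_path, check_same_thread=False))

        @contextlib.contextmanager
        def connection(self):
            conn = self._pool.get()        # block until one is free
            try:
                yield conn
            finally:
                self._pool.put(conn)       # hand it to the next request

    pool = ConnectionPool('app.db', size=4)
    with pool.connection() as conn:
        print(conn.execute('SELECT 1').fetchall())

The point is that the pool hands out whichever connection happens to be
free instead of funneling every request through a single one.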
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"Programming language design is not a rational science. Most reasoning
about it is at best rationalization of gut feelings, and at worst plain
wrong."  --GvR, python-ideas, 2009-03-01


