Multiple independently started python processes and sharing of a module

Kushal Kumaran kushal.kumaran+python at gmail.com
Fri Jan 14 05:05:25 EST 2011


On Fri, Jan 14, 2011 at 1:51 PM, Martin P. Hellwig
<martin.hellwig at dcuktec.org> wrote:
> On 01/14/11 03:04, Kushal Kumaran wrote:
>>
>> ----- Original message -----
>>>
>>> Hi all,
>>>
>>> I have the following problem (for which I already have a hacked-around
>>> solution that works, but I'd like some more input on it):
>>>
>>> I have a situation where multiple python processes are started
>>> independently from each other but by the same user with the same
>>> environment (as happens with mod_wsgi, when not using daemon mode).
>>>
>>> All of these processes access a single module which needs
>>> synchronization for some of its commands: for example, a db (MySQLdb)
>>> module where, once a select is done, the fetchall must be done by that
>>> same process before another process can do anything else.
>>>
>>
>> If the processes are independent, they are not sharing the database
>> connection, unless you've taken steps to make it so.  MySQLdb imported in
>> one process should not interfere with MySQLdb imported in another process.
>>
>>> <snip>
>>
> It might be a misconfiguration but, under mod_wsgi with apache it does.
>

Ah, I didn't notice the mod_wsgi reference.  I'm out of my depth here.
Hopefully someone with mod_wsgi experience will chime in.

This might help though:
https://code.google.com/p/modwsgi/wiki/ProcessesAndThreading

It seems that if you're not using 'daemon' mode, global data might be shared.
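If that sharing is unwanted, that page suggests switching to daemon mode so the application runs in its own dedicated processes.  A minimal sketch of the relevant Apache directives (the process-group name and script path here are placeholders, not from the original setup):

```apache
# Run the application in its own daemon processes instead of
# embedded in the Apache worker processes.
WSGIDaemonProcess example-app processes=2 threads=15
WSGIProcessGroup example-app
WSGIScriptAlias / /var/www/example/app.wsgi
```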

You could create new connections for each request (and close them when
done).  There won't be interference between select/fetch operations
across multiple database connections.  Additionally, the MySQLdb
documentation says it is a bad idea to share database connections
between threads.
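To illustrate the per-request-connection pattern, here's a minimal sketch using the stdlib sqlite3 module in place of MySQLdb so it runs without a server; with MySQLdb you would call MySQLdb.connect() with your host/user/db parameters instead (the function name and DB_PATH below are illustrative):

```python
import sqlite3

# Stands in for the real MySQLdb.connect() parameters.
DB_PATH = ":memory:"

def handle_request():
    # Open a fresh connection for this request only; independent
    # requests never share a connection's select/fetch state.
    conn = sqlite3.connect(DB_PATH)
    try:
        cur = conn.cursor()
        cur.execute("SELECT 1 + 1")
        rows = cur.fetchall()  # fetch completes on *this* connection
        return rows
    finally:
        conn.close()  # nothing leaks into the next request

print(handle_request())
```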

-- 
regards,
kushal
