Multiple, separated interpreters in a single process

Tobias Oberstein Tobias.Oberstein at gmx.de
Thu Feb 6 12:39:08 EST 2003


.. effectively, regarding the above topic, I was thinking along the
following lines (is this a way to go?):


struct _wrld;
typedef struct _wrld {

   /* GIL is global within this world */
   PyThread_type_lock interpreter_lock;

   /* non-zero once this world has been initialized */
   int initialized;

   /* ... all other formerly process-global interpreter state ... */
} PyWorld;



PyWorld* Py_Initialize(PyWorld *world);

PyThreadState* Py_NewInterpreter(PyWorld *world);

void Py_EndInterpreter(PyWorld *world, PyThreadState *tstate);

void Py_Finalize(PyWorld *world);



#define Py_BEGIN_ALLOW_THREADS(world) { \
			PyThreadState *_save; \
			_save = PyEval_SaveThread(world);
#define Py_BLOCK_THREADS(world)	PyEval_RestoreThread(world,_save);
#define Py_UNBLOCK_THREADS(world)	_save = PyEval_SaveThread(world);
#define Py_END_ALLOW_THREADS(world)	PyEval_RestoreThread(world,_save); \
		 }
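
For illustration, a rough usage sketch of the above -- purely hypothetical,
since none of these world-taking entry points exist in current CPython; I'm
assuming here that Py_Initialize fills in a caller-supplied PyWorld and
returns it, and that other entry points would likewise grow world-taking
variants:

static void run_isolated(const char *code)
{
    PyWorld world;                       /* caller-owned world storage */
    PyWorld *w = Py_Initialize(&world);  /* per-world runtime init */
    PyThreadState *tstate = Py_NewInterpreter(w);

    /* every API that touches interpreter state would need the world as
       well, e.g. a hypothetical PyRun_SimpleStringWorld(w, code) */
    (void)code;

    Py_EndInterpreter(w, tstate);
    Py_Finalize(w);
}

Two native threads could then each call run_isolated() concurrently; since
each world carries its own interpreter_lock, neither blocks the other on a
shared GIL.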



> -----Original Message-----
> From: Tobias Oberstein [mailto:Tobias.Oberstein at gmx.de]
> Sent: Thursday, 6 February 2003 17:10
> To: python-list at python.org
> Subject: Multiple, separated interpreters in a single process
>
>
> I know this issue has come up before, but I couldn't find a
> definitive and current answer. So maybe someone could clarify?
>
> I'd like to have multiple interpreters within a single process
> such that the interpreters are completely separated (like in Tcl):
>
> - different object spaces
> - different GILs
>
> I cannot go with namespaces or interpreters in different
> processes, because though I don't need to share Python object
> spaces directly (which would obviously require fine-grained
> locking with all its performance hits), I need to share the
> application context (OODBMS) of the multithreaded application
> that embeds Python. The application context is wrapped up in
> Python extension classes, which take care of the necessary
> synchronisation and locking.
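
As a rough sketch of the kind of sharing meant here (plain C, hypothetical
names): the embedding application owns a single context object, extension
objects in every interpreter hold a pointer to it, and the context does its
own native locking instead of relying on any GIL:

#include <pthread.h>

typedef struct {
    pthread_mutex_t lock;   /* context-level lock, independent of any GIL */
    void *oodbms_session;   /* opaque handle into the embedding application */
} app_context;

/* Extension methods would funnel all access through helpers like this,
   so interpreters in different worlds can share the context safely. */
static void *app_context_query(app_context *ctx, const char *query)
{
    void *result = NULL;
    pthread_mutex_lock(&ctx->lock);
    /* ... run the query against ctx->oodbms_session ... */
    (void)query;
    pthread_mutex_unlock(&ctx->lock);
    return result;
}
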
>
> As far as I could grasp from looking at the Python sources, this
> _is_ a problem, since the Python interpreter makes extensive use
> of global static data. Right?
>
> Now, how much would it take to get there? Would the following
> "systematic" approach work?
>
> 1. identify all global static data in the sources
>
> 2. collect them all in one data structure, e.g. interpreter_state
>
> 3. identify all functions that reference global static data
>
> 4. change all those functions to take one additional parameter
>    (*interpreter_state) and adapt function bodies to reference
>    the data within the interpreter_state structure
>
> 5. add wrapper functions taking the original parameters (without
>    *interpreter_state) and let them call the new ones with
>    a global static *interpreter_state, for reasons of backward
>    compat.
>
> Maybe I'm naive .. the above recipe seems straightforward, but I'm sure
> there are problems. What would still be left to do, once the above had
> been done?
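
For concreteness, here is what steps 2-5 above might look like for a single
piece of formerly global static state (the names are made up for
illustration, not taken from the actual sources):

/* step 2: the former "static int recursion_limit = 1000;" moves into a
   per-interpreter structure */
typedef struct _interp_state {
    int recursion_limit;
    /* ... every other formerly global static ... */
} interp_state;

/* steps 3-4: functions that touched the global now take the state */
static int get_recursion_limit_ex(interp_state *state)
{
    return state->recursion_limit;
}

/* step 5: a wrapper with the original signature calls the new function
   with one process-wide state, so existing callers keep working */
static interp_state global_state = { 1000 };

static int get_recursion_limit(void)
{
    return get_recursion_limit_ex(&global_state);
}
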
>
> Greets,
> Tobias
>
