[capi-sig] Embedded Python in C application

Adam Olsen rhamph at gmail.com
Sat Sep 27 08:02:34 CEST 2008


On Fri, Sep 26, 2008 at 11:24 PM, Swapnil Talekar <swapnil.st at gmail.com> wrote:
> On Sat, Sep 27, 2008 at 3:02 AM, Adam Olsen <rhamph at gmail.com> wrote:
>
>> On Fri, Sep 26, 2008 at 9:16 AM, Eljay Love-Jensen <eljay at adobe.com>
>> wrote:
>> > Hi everyone,
>> >
>> > First, my apologies if I'm in the wrong forum for my "embedding Python in
>> a
>> > C application" questions.  Please redirect me if I've wandered into the
>> > wrong place.
>> >
>> > I have two needs for using Python in my application that I hope has an
>> easy
>> > answer without rewriting Python's internals.
>> >
>> > I need to use Python* in a multi-threaded application, where separate
>> > threads may be working on very long lasting Python scripts, and other
>> > threads may be involved in short Python scripts.  None of the Python
>> scripts
>> > running concurrently have any shared state with any of the other Python
>> > scripts running concurrently.  Number of threads is in the 100-1000
>> range.
>> >
>> > I need to manage Python's use of the heap by providing a memory pool for
>> > Python to use, rather than allowing Python to use malloc/free.  This is
>> to
>> > prevent memory fragmentation, and to allow easy disposal of a memory pool
>> > used for a closed Python interpreter instance.
>> >
>> > A quick view of Py_Initialize() indicates that Python does not return
>> some
>> > sort of "Py_State" pointer which represents the entire state of a Python
>> > interpreter.  (Nor some sort of Py_Alloc().)  Nor accepts a custom
>> > malloc/free function pointers.  Hmmm.
>> >
>> > Does anyone have experience with using Python in this fashion?
>>
>> Don't use multiple interpreters.  They're not really separate,
>> they're buggy, they offer *NO* advantage to you over just using
>> multiple threads.
>
> They're buggy? Sure. They're not really separate? Well, if you want to
> have multiple threads running scripts, I don't see how you can get away
> without having multiple interpreters (in the same process), and they
> REALLY have to be separate. That's not an easy task, though. As I said,
> the separation has to be more than just separate PyInterpreterStates.

You must not be very familiar with threading.  All you need to do is
give each script its own *local* state and not modify any globals.
No need for multiple interpreters.
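To sketch that idea in pure Python (the script strings and variable names here are illustrative, not from the thread): each thread executes its script against its own globals dict, so the scripts share nothing unless they deliberately reach for process-wide state.

```python
import threading

def run_script(source, results, key):
    # Each script gets a fresh globals dict: its "interpreter state"
    # is local to this call, and no other script can see it.
    ns = {}
    exec(source, ns)
    results[key] = ns.get("answer")

results = {}
threads = [
    threading.Thread(target=run_script, args=("answer = 2 + 2", results, "a")),
    threading.Thread(target=run_script, args=("answer = 'hi' * 2", results, "b")),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results["a"], results["b"])  # 4 hihi
```

The same pattern applies when embedding: each C thread holds the GIL while it runs, executes against its own dictionary, and releases the GIL when blocked or finished.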

All the subinterpreter API does is give each interpreter a separate
copy of the modules, so poorly designed APIs that use global state can
pretend they've got separate processes, without actually having
separate processes.  Rather obscure, and not useful for the OP.
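The shared-module point is easy to demonstrate from Python itself (the `tag` attribute below is just an illustrative marker): two scripts run in completely separate namespaces still import the very same module object, and that is the coupling the subinterpreter API exists to hide.

```python
# Two scripts, two private namespaces -- but "import math" hands both
# of them the single module object cached in sys.modules.
ns1, ns2 = {}, {}
exec("import math; math.tag = 'from ns1'", ns1)
exec("import math; shared = getattr(math, 'tag', None)", ns2)
print(ns2["shared"])  # from ns1
```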


>> Likewise, you can't force memory to be freed, as it'd still be used by
>> Python.
>>
>> The only way to force cleanup is to spawn a subprocess.  This'd also
>> let you use multiple cores.  You can probably mitigate the startup
>> cost by having a given subprocess run several short scripts or one
>> long script.
>
> Well, if you have your own memory manager, i.e. other than Python's,
> and you are embedding the interpreter in your application, I don't see
> any reason why you should not be able to clean up at any appropriate
> point you think. Python is still using the memory? Sure it is. But not
> after it's done with the script.

If you free any memory like that you'll have hosed the entire Python
interpreter.  After that there's nothing useful to do but exit the
process, so you might as well exit in the first place.
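A minimal sketch of the subprocess approach suggested earlier in the thread (the script string here is a placeholder): the script runs in a child interpreter, and when the child exits the OS reclaims all of its memory at once, with no fragmentation left in the parent.

```python
import subprocess
import sys

# Run a script in a fresh child interpreter.  Exiting the child is the
# one guaranteed way to return every byte it allocated to the OS.
script = "data = list(range(100000)); print(sum(data))"
proc = subprocess.run(
    [sys.executable, "-c", script],
    capture_output=True, text=True, timeout=60,
)
print(proc.stdout.strip())  # 4999950000
```

A pool of such worker processes, each running several short scripts or one long one, amortizes the interpreter startup cost and also lets the work spread across multiple cores.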


-- 
Adam Olsen, aka Rhamphoryncus

