Embedded Python and multiprocessing on Windows?

Apple applemask at gmail.com
Thu Aug 9 13:33:58 EDT 2018


A quick update: I found a minor workaround, but I don't quite understand why it works this way or how I would fix it to work differently:

So my program runs one script file, and multiprocessing commands from that script file seem to fail to spawn new processes.

However, if that script file calls a function in a separate script file that it has imported, and that function calls multiprocessing functions, it all works exactly the way it should.

So now I think my question is: why does it behave this way, and is there anything I can do to fix the behavior in the first script? It's not the end of the world to need two scripts, but it sure would be a lot cleaner with just one.
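For reference, here's a minimal sketch of the two-file arrangement that works for me. The names (worker.py, job, spawn_jobs) are just placeholders for illustration:

# worker.py -- a separate, importable module sitting next to the main script
import multiprocessing as mp

def job():
    print("hello from the child process")

def spawn_jobs():
    # Creating the Process here, inside an imported module, is the
    # workaround described above.
    p = mp.Process(target=job)
    p.start()
    p.join()

# script.py -- the single script the host application runs
import worker

if __name__ == '__main__':   # good practice with spawn, even when embedded
    worker.spawn_jobs()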

On Thursday, August 9, 2018 at 12:09:36 PM UTC-4, Apple wrote:
> I've been working on a project involving embedding Python into a Windows application. I've got all of that working fine on the C++ side, but the script side seems to be hitting a dead end with multiprocessing. When my script tries to run the same multiprocessing code that works in a non-embedded environment, the code doesn't appear to be executed at all.
> 
> A StackOverflow thread gave me a few ideas to try out, so I gave it a shot with these lines added:
> 
> import sys
> import multiprocessing as mp
> 
> sys.argv = ['C:\\path\\to\\script.py']
> mp.set_executable("C:\\Python37\\python.exe")
> p = mp.Process(target=somefunction)
> p.start()
> 
> Still no joy. However, a Python.exe window does pop up for a tenth of a second, so *something* is happening.
> 
> So does multiprocessing actually work in embedded Python, or am I at a dead end here? My guess is that it's probably linked to Windows not having fork, so the combination of being embedded and having to spawn is making things go goofy.
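
For completeness, here is roughly how those lines fit together now that they live in the imported module rather than in the main script. The module name, interpreter path, script path, and worker function are all placeholders:

import sys
import multiprocessing as mp

def somefunction():
    print("running in the spawned process")

def start_worker():
    # When embedding, sys.argv and sys.executable may not be set up the way
    # multiprocessing expects, so supply them explicitly.
    sys.argv = ['C:\\path\\to\\script.py']
    mp.set_executable('C:\\Python37\\python.exe')
    p = mp.Process(target=somefunction)
    p.start()
    p.join()

# The embedded main script then just does:
#   import worker_module
#   worker_module.start_worker()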



