Overhead of individual python apps

Mike Meyer mwm at mired.org
Tue Sep 27 18:53:41 EDT 2005


"Qopit" <russandheather at gmail.com> writes:

> When running in Windows, launching each application generates a
> process, and each of those processes ends up taking up > 4MB of system
> memory.  This memory usage is as reported by the Windows Task manager
> for the python.exe image name.

The first step is to clarify what's being reported. If the Windows
Task Manager is reporting the total memory usage for each process,
then it's overestimating the total usage by a considerable amount. In
particular, all the Python executable code should be shared by all the
processes. Unless you load compiled extensions, anyway - those will
only be shared by the processes that use them. You'll need something
that reports the stack, heap, and code segment sizes separately to
work out what the memory usage really is.
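As a rough illustration of why a single per-process total is misleading,
here's a minimal sketch on a Unix-like system (an assumption - the Windows
tooling from the thread is different): the resident set size counts shared,
file-backed pages such as the interpreter's code segment in full for every
process, so summing it across N interpreters overstates the real cost.

```python
import resource

# Unix-only sketch: peak resident set size for this process, in kB.
# Shared pages (e.g. the interpreter's code segment) are included in
# full here, which is why adding up naive per-process totals
# overstates what N interpreters actually cost.
peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
print(f"peak RSS: {peak_kb} kB")
```

A tool that splits the figure into private vs. shared pages is what you'd
actually want; the point of the sketch is only that the headline number
includes both.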

> My Question: Is there any way to reduce this per-process overhead?  eg:
> can you set it somehow so that one python.exe instance handles multiple
> processes?

The OS should do that for you by default.
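To make the sharing concrete, here's a Linux-only sketch (an assumption -
not how you'd inspect this on Windows) that lists the executable mappings
of the current process from /proc: the same file-backed code pages are
mapped into every process running that binary, so the interpreter's code
segment is paid for roughly once, not once per process.

```python
# Linux-only sketch: list executable (r-xp) mappings of this process.
# These file-backed code pages are shared between every process that
# runs the same binary or loads the same shared library.
with open("/proc/self/maps") as f:
    exec_maps = [line.split() for line in f if " r-xp " in line]

for m in exec_maps[:5]:
    print(m[0], m[-1])  # address range, backing file
```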

> One possibility considered is to run them as threads of a single
> process rather than multiple processes, but this has other drawbacks
> for my application and I'd rather not,

That shouldn't help memory usage - the data that isn't shared across
processes would need to be thread-private in any case.
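A small sketch of that point: per-worker state doesn't disappear when you
switch from processes to threads, it just becomes thread-private. Using
the stdlib's threading.local, each thread below still allocates its own
copy of the working buffer (the names and sizes here are illustrative):

```python
import threading

# Per-thread storage: each thread sees its own attributes on `local`.
local = threading.local()
results = {}

def worker(name):
    # This buffer is allocated once per thread, just as it would be
    # allocated once per process in the multi-process design.
    local.buffer = [0] * 1000
    results[name] = len(local.buffer)

threads = [threading.Thread(target=worker, args=(f"t{i}",)) for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(results)  # one entry per thread, each with its own buffer
```

What threads save you is mainly the per-process bookkeeping and any
read-only data the OS fails to share, not the per-worker data itself.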

The reason for the uncertainty is that I'm not positive that Windows
behaves sanely in this area. It may be that Windows doesn't have
shared executables. In this case, one solution is to move to a modern
OS - like v6 Unix :-).

   <mike
-- 
Mike Meyer <mwm at mired.org>			http://www.mired.org/home/mwm/
Independent WWW/Perforce/FreeBSD/Unix consultant, email for more information.
