[SciPy-User] MemoryError when I have plenty of available RAM

David Cournapeau david at ar.media.kyoto-u.ac.jp
Wed Sep 30 23:01:43 EDT 2009


David Goldsmith wrote:
> On Wed, Sep 30, 2009 at 6:25 PM, David Cournapeau
> <cournape at gmail.com> wrote:
>
>     On Thu, Oct 1, 2009 at 3:58 AM, Gustaf Nilsson
>     <gustaf at laserpanda.com> wrote:
>     > Hiya
>     >
>     > I know someone just started a memory thread, but I didn't want
>     > to hijack it. The image-processing app I'm working on seems to
>     > crash with a "MemoryError" when it hits about 1.1 GB of memory
>     > usage (the same on two computers with 2 GB and 4 GB of RAM,
>     > both running 32-bit Windows XP).
>
>     If possible, a small script which reproduces the problem would be
>     helpful.
>
>     Keep in mind that on Windows, by default, your Python script cannot
>     use more than 2 GB anyway, even if you have 4 GB of memory.
>
>
> Interesting.  Is this true in Vista?  Windows 7? 

It is true for (at least) most OSes, actually; it is a limitation of
32-bit addressing. The only workaround is to use several processes. The
root of it is that a process cannot 'see' more than 4 GB with 32-bit
addresses, and part of that range has to be reserved for the kernel:
by default, both Windows and Linux limit the virtual address space to
2 GB per process in userland. Linux has options to split it 3 GB
user / 1 GB kernel (or the reverse), and Windows has something similar.
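
In practice the several-processes workaround is easy to prototype with
the multiprocessing module (new in Python 2.6). Here is a minimal
sketch, assuming the work splits into independent chunks; the file name
'big_image.npy' and the process_chunk function are hypothetical
placeholders, not anything from your app:

import numpy as np
from multiprocessing import Pool

CHUNK = 512  # rows handled per worker task

def process_chunk(start):
    # Each worker memory-maps the file and only materializes its own
    # slice, so no single process gets near the ~2 GB address ceiling.
    data = np.load('big_image.npy', mmap_mode='r')
    block = np.asarray(data[start:start + CHUNK], dtype=np.float64)
    return block.sum()  # stand-in for the real per-chunk processing

if __name__ == '__main__':  # this guard is mandatory on Windows
    n_rows = np.load('big_image.npy', mmap_mode='r').shape[0]
    pool = Pool(processes=4)
    results = pool.map(process_chunk, range(0, n_rows, CHUNK))
    pool.close()
    pool.join()
    print(sum(results))

The point is that each worker is its own process with its own 2 GB
address space, so the total job can use far more memory than any one
process could.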

There is a pretty good explanation of the gory details for Linux here:
http://kerneltrap.org/node/2450 (I would be surprised if the Windows
kernel were fundamentally different, except for the fork thing, of
course).

The true solution is to use a 64-bit OS, with a 64-bit build of Python.
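
One quick way to check whether the interpreter itself is a 32- or
64-bit build (a 32-bit Python on a 64-bit OS still hits the same
per-process limit):

import struct
# Pointer size in bits: 32-bit builds print 32, 64-bit builds print 64.
print(struct.calcsize('P') * 8)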

cheers,

David


