[SciPy-User] Strange memory limits

Christoph Gohlke cgohlke at uci.edu
Tue Mar 29 13:39:35 EDT 2011



On 3/29/2011 10:31 AM, Chris Weisiger wrote:
> Thanks for the information, all. Annoyingly, I can't allocate 3
> individual 735x512x512 arrays (each ~360MB) -- the third gives a memory
> error. If I allocate 1024x1024 byte arrays (thus, each 1MB), I can make
> 1500 before getting a memory error. So I'm definitely running into
> *some* issue that prevents larger blocks from being allocated, but I'm
> also hitting a ceiling well before I should be.
>
> I had thought that Windows allowed for 3GB address spaces for 32-bit
> processes, but apparently (per
> http://msdn.microsoft.com/en-us/library/aa366778%28v=vs.85%29.aspx#memory_limits
> ) that only applies if the program has IMAGE_FILE_LARGE_ADDRESS_AWARE
> and 4GT set...sounds like I'd need a recompile and some system tweaks to
> set those. The proper, and more work-intensive, solution would be to
> make a 64-bit build.
>
> My (admittedly limited) understanding of memory fragmentation was that
> it's a per-process problem. I'm seeing this issue immediately on
> starting up the program, so the program's virtual memory address space
> should be pretty clean.
>
> -Chris
>
> On Tue, Mar 29, 2011 at 10:14 AM, Christopher Barker
> <Chris.Barker at noaa.gov> wrote:
>
>     On 3/28/11 4:51 PM, David Baddeley wrote:
>     >  what you're probably running into is a problem with allocating a
>     >  contiguous block of memory / a memory fragmentation issue.
>
>     On 3/28/11 8:01 PM, Charles R Harris wrote:
>
>     >  Windows 32 bit gives you 2GB and keeps the rest for itself.
>
>     right -- I've had no problems running the same code with 32 bit python
>     on OS-X that crashes out with memory errors on Windows -- similar
>     hardware.
>
>     >  Compiling as 64 bit might solve your problem, as, with 12 GB of memory,
>     >  there will be a larger address space to look for contiguous blocks in,
>     >  but probably doesn't address the fundamental issue.
>
>     Ah, but while you may still have only 12GB of physical memory, with 64 bit
>     Windows and Python the virtual address space is massive, so I suspect
>     you'll be fine. Using 1GB memory buffers on 32 bit is certainly pushing it.
>
>     >  I suspect you could probably get away with having much smaller
>     >  contiguous blocks (e.g. have 3 separate arrays for the 3 different
>     >  cameras) or even a new array for each image.
>
>     That would make it easier for the OS to manage the memory well.
>
>     -Chris
>
>
>     --
>     Christopher Barker, Ph.D.
>     Oceanographer
>
>     Emergency Response Division
>     NOAA/NOS/OR&R            (206) 526-6959   voice
>     7600 Sand Point Way NE   (206) 526-6329   fax
>     Seattle, WA  98115       (206) 526-6317   main reception
>
>     Chris.Barker at noaa.gov
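
To make the numbers in Chris's message above concrete, here is a minimal
sketch of that allocation test. The shapes come from the thread; the uint16
dtype is an assumption (735x512x512 16-bit pixels is ~367 MB, which matches
the ~360 MB figure):

import numpy as np

def try_large_blocks(n=3, shape=(735, 512, 512), dtype=np.uint16):
    # Each array needs one contiguous ~367 MB block.  On 32-bit Windows the
    # third allocation typically fails because no hole that large is left in
    # the 2 GB user address space.
    blocks = []
    for i in range(n):
        try:
            blocks.append(np.zeros(shape, dtype=dtype))
        except MemoryError:
            print("MemoryError on large block %d of %d" % (i + 1, n))
            break
    return blocks

def try_small_blocks(limit=2000, shape=(1024, 1024), dtype=np.uint8):
    # Many 1 MB blocks fit into the same address space because each one only
    # needs a small contiguous hole; the thread reports ~1500 of these
    # succeeding before a MemoryError.
    blocks = []
    for i in range(limit):
        try:
            blocks.append(np.zeros(shape, dtype=dtype))
        except MemoryError:
            print("MemoryError after %d one-MB blocks" % i)
            break
    return blocks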

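David's suggestion of avoiding one huge contiguous block might look like the
sketch below: keep a separate stack per camera, or simply a list of per-image
arrays (shapes and dtype are again assumptions):

import numpy as np

def per_camera_stacks(n_cameras=3, n_frames=735, frame_shape=(512, 512)):
    # One block per camera: the largest single allocation drops from ~1.1 GB
    # (all three cameras in one array) to ~367 MB per camera.
    return [np.zeros((n_frames,) + frame_shape, dtype=np.uint16)
            for _ in range(n_cameras)]

def per_image_lists(n_cameras=3, n_frames=735, frame_shape=(512, 512)):
    # Or one array per image: no single allocation is larger than ~0.5 MB, at
    # the cost of a little Python-level bookkeeping.
    return [[np.zeros(frame_shape, dtype=np.uint16) for _ in range(n_frames)]
            for _ in range(n_cameras)]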

Try VMMap 
<http://technet.microsoft.com/en-us/sysinternals/dd535533.aspx>. The 
software lists, among other useful information, the sizes of contiguous 
blocks of memory available to a process. You'll probably find that 64 
bit Python lets you use a much larger contiguous block than 32 bit Python.

It could help to create the large numpy arrays early in the program, e.g. 
before importing other packages or creating other arrays, while the process 
address space is still largely unfragmented.
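
A rough sketch of that ordering (buffer shapes, dtype, and names are only
illustrative, taken from the figures earlier in the thread):

# Allocate the big contiguous buffers at the very top of the program, before
# other imports and allocations can fragment the 2 GB address space of a
# 32-bit process.
import numpy as np

CAMERA_BUFFERS = [np.empty((735, 512, 512), dtype=np.uint16)
                  for _ in range(3)]

# Only now import the rest of the application.
# import scipy, etc.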

Christoph


