[SciPy-User] Problem with handling big matrices with Windows

Sturla Molden sturla.molden at gmail.com
Wed Jun 11 08:57:01 EDT 2014


Antonelli Maria Rosaria <maria-rosaria.antonelli at curie.fr> wrote:
> Thank you. 
> I am going to try with memmap.

The virtual address space at your disposal is just 2 GB. That includes both
memory allocated from RAM and memory maps. Memory-mapping a file leaves
correspondingly less address space for RAM allocations, but the total still
cannot exceed 2 GB. That is the upper limit for 32-bit Windows applications.
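A quick way to see which limit applies to you is to check the pointer size of the running interpreter. This is just an illustrative sketch; the printed messages are mine, not part of any API:

```python
import struct
import sys

# Pointer size in bytes: 4 on a 32-bit Python, 8 on a 64-bit Python.
pointer_bytes = struct.calcsize("P")
print("Pointer size:", pointer_bytes * 8, "bit")

# The per-process address-space ceiling follows from the pointer size;
# on 32-bit Windows only 2 GB of it is available to user code by default.
if pointer_bytes == 4:
    print("32-bit process: RAM + memmaps must fit in ~2 GB")
else:
    print("64-bit process: address space is effectively unlimited")
```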

memmap can be used as "extended memory" if you run out of RAM on 64-bit
systems, because the virtual address space there is astronomically large. On
a 64-bit system you can in practice memory-map files of any size. This is
one of the most important reasons for preferring 64-bit Python.
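For example, a file-backed numpy.memmap behaves like an ordinary ndarray while its pages are loaded from disk on demand. The file name and shape below are just placeholders; on a 64-bit system the array can be far larger than physical RAM:

```python
import os
import tempfile
import numpy as np

# Create a memory-mapped array backed by a file on disk.
fname = os.path.join(tempfile.mkdtemp(), "bigarray.dat")
a = np.memmap(fname, dtype=np.float64, mode="w+", shape=(1000, 1000))

# Work with it like an ordinary ndarray; only touched pages hit RAM.
a[::2, :] = 1.0
print(a[:4, :4].sum())   # operates on a small slice, prints 8.0

a.flush()   # write dirty pages back to the file
```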

memmap has very limited usability on 32-bit systems. It is often used for
purposes other than avoiding loading large files into RAM. On ARM (e.g.
Raspberry Pi) you communicate with hardware via memory mapping, so with
numpy.memmap you can talk to any hardware you connect to it. Or perhaps you
have a database file and want random access to the fields stored in it
(sorting, searching). It can then be easier to implement algorithms with
mmap than with file.read and file.write, because the same functions can be
used (often unchanged) on memmaps and arrays. Or perhaps you want to use
shared memory to share data between processes. You would then memory-map
from the paging file (fd 0 on Windows and fd -1 on Linux). But none of this
has to do with "array too big to fit in RAM".
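The shared-memory case above can be sketched with an anonymous map from Python's mmap module (passing -1 as the file descriptor works on both Windows and Linux in current Pythons) wrapped in a numpy view. The sizes here are arbitrary:

```python
import mmap
import numpy as np

# Anonymous memory map; with multiprocessing it can be inherited by
# child processes and used as shared memory.
buf = mmap.mmap(-1, 8 * 100)                  # room for 100 float64 values

# ndarray view over the mapped buffer -- no copy is made.
a = np.frombuffer(buf, dtype=np.float64)

a[:] = np.arange(100)
print(a[42])   # prints 42.0; the data lives in the mapped buffer
```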

Sturla



