[Numpy-discussion] Loading a > GB file into array

Hans Meine meine at informatik.uni-hamburg.de
Sat Dec 1 12:44:08 EST 2007


On Saturday 01 December 2007, Martin Spacek wrote:
> Kurt Smith wrote:
>  > You might try numpy.memmap -- others have had success with it for
>  > large files (32 bit should be able to handle a 1.3 GB file, AFAIK).
>
> Yeah, I looked into numpy.memmap. Two issues with that. I need to
> eliminate as much disk access as possible while my app is running. I'm
> displaying stimuli on a screen at 200 Hz, so I have at most 5 ms for
> each movie frame to load before it's too late and a frame is dropped.
> I'm sort of faking a realtime OS on Windows by setting the process
> priority really high, and disk access in the middle of that causes
> frames to drop. So I need to load the whole file into physical RAM,
> although it need not be contiguous. memmap doesn't do that: it loads
> pages on the fly as you index into the array, which drops frames, so
> it doesn't work for me.

Sounds as if opening the file with memmap and then copying each frame
into a separate in-memory ndarray could help?
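
For example (a minimal, untested sketch; the file name, dtype and
frame shape below are made-up placeholders for whatever your movie
actually uses):

  import numpy as np

  # Hypothetical movie dimensions; substitute your real ones.
  nframes, h, w = 9000, 480, 640
  mm = np.memmap('movie.raw', dtype=np.uint8, mode='r',
                 shape=(nframes, h, w))

  # Copy every frame into an ordinary in-memory ndarray up front;
  # the disk is only touched here, once, before playback starts.
  frames = [np.array(mm[i]) for i in range(nframes)]

  del mm  # the memmap and its file handle are no longer needed

Since each frame ends up as a separate ndarray, the > 1 GB of data
need not sit in one contiguous block, and indexing frames[i] during
playback never goes back to the disk.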

Ciao, /  /                                                    .o.
     /--/                                                     ..o
    /  / ANS                                                  ooo