[Numpy-discussion] Loading a > GB file into array

Bryan Cole bryan at cole.uklinux.net
Fri Nov 30 14:51:23 EST 2007


> 
> Well, one thing you could do is dump your data into a PyTables_
> ``CArray`` dataset, which you may afterwards access as if it were a
> NumPy array to get slices which are actually NumPy arrays.  PyTables
> datasets have no problem working with datasets exceeding memory size.
> For instance::

I've recently started using PyTables for storing large datasets and I'd
give it 10/10! Access is fast enough that you can read just the slices
you need and leave the full array on disk.
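For anyone finding this thread later, the pattern described above might look
roughly like the following. This is only a minimal sketch using the modern
PyTables API (the file name, array shape, and block size are made up for
illustration):

```python
import numpy as np
import tables as tb  # PyTables

fname = "big.h5"  # hypothetical on-disk store; only slices ever enter RAM

# Create a chunked, on-disk CArray (shape chosen just for the example).
with tb.open_file(fname, mode="w") as f:
    carr = f.create_carray(f.root, "data", tb.Float64Atom(),
                           shape=(1000, 500))
    # Write in blocks so only one block is in memory at a time.
    for i in range(0, 1000, 100):
        carr[i:i + 100, :] = np.random.rand(100, 500)

# Slicing the dataset reads only that region from disk and returns
# an ordinary in-memory NumPy array.
with tb.open_file(fname, mode="r") as f:
    block = f.root.data[10:20, :50]

print(type(block), block.shape)
```

The same slicing syntax works however large the full array is, since
PyTables only materialises the requested region.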

BC
