[Numpy-discussion] saving numpy arrays incrementally

Kim Hansen slaunger at gmail.com
Tue Aug 11 02:57:40 EDT 2009


I have faced similar challenges in my work, and appending the numpy arrays
to HDF5 files using PyTables has been the solution for me. Used in
combination with lzo compression/decompression, it has led to very high
read/write performance in my application with low memory consumption.
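A minimal sketch of that append pattern (file name, chunk shape, and row
counts are placeholders; it assumes PyTables is installed with the LZO
filter available, and uses the current open_file/create_earray spelling --
older releases spell these openFile/createEArray):

import numpy as np
import tables

NCOLS = 4  # placeholder: each chunk is a block of rows with 4 float64 columns

h5 = tables.open_file("chunks.h5", mode="w")
earray = h5.create_earray(
    h5.root, "data",
    atom=tables.Float64Atom(),
    shape=(0, NCOLS),                                   # first axis grows on append
    filters=tables.Filters(complevel=5, complib="lzo"),
    expectedrows=10000000,                              # rough size hint for chunking
)

for _ in range(1000):                  # stand-in for the real chunk-producing loop
    chunk = np.random.rand(5000, NCOLS)
    earray.append(chunk)               # only the current chunk is held in memory
    h5.flush()                         # partial results are safe on disk

h5.close()

Each append writes the chunk straight to disk, so memory use stays bounded
by the chunk size, and the file can later be reopened with mode="a" to keep
appending across runs.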
You may also want to have a look at the h5py package.
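The equivalent with h5py is a resizable, chunked dataset; here is a sketch
with the same placeholder names, using gzip since the LZO filter is not
bundled with h5py:

import numpy as np
import h5py

NCOLS = 4  # placeholder column count

h5 = h5py.File("chunks_h5py.h5", "w")
dset = h5.create_dataset(
    "data",
    shape=(0, NCOLS),
    maxshape=(None, NCOLS),   # unlimited along the first axis
    chunks=True,
    compression="gzip",       # lzf is the other built-in choice
)

for _ in range(1000):         # stand-in for the real producer loop
    chunk = np.random.rand(5000, NCOLS)
    n = dset.shape[0]
    dset.resize(n + chunk.shape[0], axis=0)
    dset[n:] = chunk          # write only the new rows
    h5.flush()

h5.close()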
Kim

2009/8/11 Juan Fiol <fiolj at yahoo.com>

> Hi, I am creating numpy arrays in chunks and I want to save each chunk as
> my program produces it. I tried numpy.save, but it failed because it is not
> intended to append data. I'd like to know what, in your opinion, is the
> best way to go. I will write a few thousand entries at a time, eventually
> building up a file of several GB. I do not want to load all of the previous
> data into memory each time, and I cannot wait until the program finishes:
> I must save partial results periodically. Thanks, any help will be
> appreciated.
> Juan

