Reading a large CSV file

Mag Gam magawake at gmail.com
Mon Jun 22 23:17:22 EDT 2009


Hello All,

I have a very large (14 GB) CSV file, and I am planning to move all of
my data to HDF5. I am using h5py to load the data. The biggest problem
I am having is that I am reading the entire file into memory and then
creating a dataset from it. This is very inefficient, and it takes over
4 hours to create the HDF5 file.

The CSV file's columns have the following types:
int4, int4, str, str, str, str, str
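
Roughly, what I am doing now looks something like the sketch below.
It is heavily simplified: the real column names, string widths, and
file names are different, and the actual parsing is more involved.

import csv
import numpy as np
import h5py

# Compound dtype matching the column types above (the field names and
# the 64-byte string width are made up for this sketch).
dt = np.dtype([('id', 'i4'), ('count', 'i4'),
               ('s1', 'S64'), ('s2', 'S64'), ('s3', 'S64'),
               ('s4', 'S64'), ('s5', 'S64')])

# Read every row into memory first...
with open('data.csv') as fin:
    rows = [(int(r[0]), int(r[1]), r[2], r[3], r[4], r[5], r[6])
            for r in csv.reader(fin)]

# ...then build one huge array and write it out in a single call.
arr = np.array(rows, dtype=dt)
with h5py.File('data.h5', 'w') as fout:
    fout.create_dataset('table', data=arr)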

I was wondering if anyone knows of any techniques to load this file
faster. For example, would an incremental approach along the lines of
the sketch below be the right direction?
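
What I imagine might help, though I have not worked out whether this
is the right way to do it with h5py, is writing the file in chunks to
a resizable dataset, roughly like this untested sketch (the chunk
size, field names, and string widths are all made up):

import csv
import itertools
import numpy as np
import h5py

dt = np.dtype([('id', 'i4'), ('count', 'i4'),
               ('s1', 'S64'), ('s2', 'S64'), ('s3', 'S64'),
               ('s4', 'S64'), ('s5', 'S64')])

ROWS_PER_CHUNK = 100000  # arbitrary guess at a reasonable block size

with open('data.csv') as fin, h5py.File('data.h5', 'w') as fout:
    reader = csv.reader(fin)
    # A chunked dataset with an unlimited first dimension, so it can
    # be extended as each block of rows is converted.
    dset = fout.create_dataset('table', shape=(0,), maxshape=(None,),
                               dtype=dt, chunks=True)
    while True:
        block = [(int(r[0]), int(r[1]), r[2], r[3], r[4], r[5], r[6])
                 for r in itertools.islice(reader, ROWS_PER_CHUNK)]
        if not block:
            break
        arr = np.array(block, dtype=dt)
        start = dset.shape[0]
        dset.resize((start + len(arr),))
        dset[start:] = arr

Would something like this avoid holding the whole 14 GB in memory, or
is there a better-established technique for this?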

TIA


