fastest way to read a text file into a numpy array

Heli hemla21 at gmail.com
Thu Jun 30 11:49:31 EDT 2016


Dear all, 

After a few tests, I think I need to correct my question a bit. I will give an example here. 

I have file 1 with 250 lines:
X1,Y1,Z1
X2,Y2,Z2
....

Then I have file 2 with 3M lines:
X1,Y1,Z1,value11,value12,value13,....
X2,Y2,Z2,value21,value22,value23,...
....

I need to interpolate values at the coordinates in file 1 from the data in file 2 (using nearest-neighbour interpolation). 
I am using scipy.interpolate.griddata for this.  

scipy.interpolate.griddata(points, values, xi, method='linear', fill_value=nan, rescale=False)
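For reference, a minimal sketch of that call with method='nearest', using tiny made-up arrays in place of the real files (the coordinates and values here are placeholders, not data from file 1 or file 2):

```python
import numpy as np
from scipy.interpolate import griddata

# toy stand-ins for the (x, y, z) coordinates and one value column of file 2
points = np.array([[0.0, 0.0, 0.0],
                   [1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0]])
values = np.array([1.0, 2.0, 3.0])

# toy stand-in for the coordinates of file 1
xi = np.array([[0.9, 0.0, 0.0]])

# nearest-neighbour interpolation: each query point takes the value
# of the closest known point
out = griddata(points, values, xi, method="nearest")
# out[0] == 2.0, since [0.9, 0, 0] is closest to [1, 0, 0]
```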

When profiling the code, reading the files into numpy is not the culprit; the griddata call is. 

time to read file2= 2 min
time to interpolate= 48 min

I need to repeat the griddata call above once for each column of values. I was wondering whether there are any ways to reduce the time spent in interpolation. 
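One possibility, if the method stays 'nearest': each griddata call rebuilds its lookup structure from scratch, so building a scipy.spatial.cKDTree over the coordinates once and then fancy-indexing all value columns with the nearest-neighbour indices avoids repeating that work per column. A sketch with toy data (the arrays are placeholders for files 1 and 2, not your real data):

```python
import numpy as np
from scipy.spatial import cKDTree

# toy stand-in for file 2: 4 known points, 2 value columns each
points = np.array([[0.0, 0.0, 0.0],
                   [1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0]])
values = np.array([[10.0, 100.0],
                   [20.0, 200.0],
                   [30.0, 300.0],
                   [40.0, 400.0]])

# toy stand-in for the 250 coordinates in file 1
xi = np.array([[0.9, 0.1, 0.0],   # closest to points[1]
               [0.0, 0.1, 0.9]])  # closest to points[3]

tree = cKDTree(points)       # built once, reused for every column
_, idx = tree.query(xi)      # indices of the nearest known points
result = values[idx]         # all value columns picked in one shot
```

Since nearest-neighbour interpolation only copies the value of the closest point, one tree query gives the indices needed for every column at once.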


Thank you very much in advance for your help, 




