[SciPy-User] Problem with np.load() on Huge Sparse Matrix

Ryan R. Rosario uclamathguy at gmail.com
Fri Jun 4 02:39:28 EDT 2010


Is this a bug? Has anybody else experienced this?

Not being able to load a matrix from disk is a huge limitation for me. I
would appreciate any help anyone can provide with this.

Thanks,
Ryan



Ryan R. Rosario wrote:
> 
> Hi,
> 
> I have a huge sparse (395000 x 395000) CSC matrix that I cannot save
> in one pass, so I saved the data, indices, indptr and shape in
> separate files, as David Warde-Farley suggested a few years back.
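> To be concrete, the save/reload approach I am using looks roughly like
> this (a sketch; file names and details are approximate, not my exact
> script):
> 
> import numpy as np
> from scipy import sparse
> 
> # intersection_matrix is the already-built 395000 x 395000 CSC matrix
> np.save("data.npy", intersection_matrix.data)
> np.save("indices.npy", intersection_matrix.indices)
> np.save("indptr.npy", intersection_matrix.indptr)
> np.save("shape.npy", np.array(intersection_matrix.shape))
> 
> # later, rebuild the matrix from the four saved pieces
> data = np.load("data.npy")
> indices = np.load("indices.npy")
> indptr = np.load("indptr.npy")
> shape = tuple(np.load("shape.npy"))
> reloaded = sparse.csc_matrix((data, indices, indptr), shape=shape)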
> 
> When I try to read back the indices pickle:
> 
> >>> np.save("indices.pickle", intersection_matrix.indices)
> >>> indices = np.load("indices.pickle.npy")
> >>> indices
> array([394852, 394649, 394533, ...,      0,      0,      0], dtype=int32)
> >>> intersection_matrix.indices
> array([394852, 394649, 394533, ...,   1557,   1223,    285], dtype=int32)
> 
> Why is this happening? My only workaround is to print all of the
> entries of intersection_matrix.indices to a text file and read them
> back in, which takes up to 2 hours. It would be great if I could get
> np.load() to work, because it is much faster.
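> The slow workaround I mean is essentially a plain-text dump of the
> index array, roughly along these lines (a sketch, not my exact code):
> 
> # write every index out as text, one integer per line ...
> np.savetxt("indices.txt", intersection_matrix.indices, fmt="%d")
> # ... and parse it back later; this round-trips correctly but takes hours
> indices = np.loadtxt("indices.txt", dtype=np.int32)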
> 
> Thanks,
> Ryan
