[SciPy-User] operations on large arrays

Vincent Davis vincent at vincentdavis.net
Sun Mar 7 01:05:03 EST 2010


I just figured out that I had a few arrays that were taking up a bunch of
the memory. That said, I still wonder if there is a better way.
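
For what it's worth, here is a rough sketch of how I could divide the job into
column chunks (assuming ordinary in-memory float64 ndarrays; column_means_chunked
is just an illustrative name, not an existing NumPy function):

import numpy as np

def column_means_chunked(a, chunk=10000):
    # Same result as np.mean(a, axis=0), but only one block of
    # columns is touched per iteration, so temporaries stay small.
    n_cols = a.shape[1]
    out = np.empty(n_cols, dtype=np.float64)
    for start in range(0, n_cols, chunk):
        stop = min(start + chunk, n_cols)
        out[start:stop] = a[:, start:stop].mean(axis=0)
    return out

# Stand-in data with the shapes from the original question.
a = np.random.rand(10, 230000)
b = np.random.rand(10, 230000)

# With rows as the variables, each correlation matrix is only
# (10, 10), so the difference itself is cheap.
diff = np.corrcoef(a) - np.corrcoef(b)

col_means = column_means_chunked(a)

A (10, 230000) float64 array is only about 18 MB, so if np.corrcoef runs out of
memory the correlations are presumably being taken over the 230,000 columns
(rowvar=0), where the result alone would be a 230000 x 230000 matrix, roughly
400 GB as float64.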

Vincent Davis
720-301-3003
vincent at vincentdavis.net
my blog <http://vincentdavis.net> | LinkedIn <http://www.linkedin.com/in/vincentdavis>


On Sat, Mar 6, 2010 at 10:22 PM, Vincent Davis <vincent at vincentdavis.net> wrote:

> I have arrays of 8-20 rows and 230,000 columns; all the data is float64.
> I want to be able to find the difference between the correlation matrices of
> two arrays.
> Let a and b be of size (10, 230000):
> np.corrcoef(a) - np.corrcoef(b)
>
> I can't seem to do this with more than 10,000 columns at a time because of
> memory limitations (about 9 GB usable to Python).
> Is there a better way?
>
> I also have a problem finding the column means, which is surprising to me; I
> was not able to get the column means for 10,000 columns, but I can compute
> the corrcoef?
> np.mean(a, axis=0)
>
> Do I just need to divide up the job, or is there a better approach?
>
> Thanks
>
> Vincent Davis
> 720-301-3003
> vincent at vincentdavis.net
> my blog <http://vincentdavis.net> | LinkedIn <http://www.linkedin.com/in/vincentdavis>
>
>
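
On the "better approach" part of the question above: if the arrays already live
on disk, np.memmap may help, since only the slices actually touched get read
into memory. A rough sketch, assuming the data has been written out as raw
float64 (e.g. with ndarray.tofile); the file names are made up:

import numpy as np

# Open the arrays without loading them; slicing reads only what is needed.
a = np.memmap('a_data.dat', dtype=np.float64, mode='r', shape=(10, 230000))
b = np.memmap('b_data.dat', dtype=np.float64, mode='r', shape=(10, 230000))

# Row-wise correlation matrices are (10, 10), so the result stays small.
diff = np.corrcoef(a) - np.corrcoef(b)

col_means = a.mean(axis=0)  # column means; reads the data in one pass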

