[Numpy-discussion] very large matrices.

Charles R Harris charlesr.harris at gmail.com
Sun May 13 10:30:47 EDT 2007


On 5/13/07, Dave P. Novakovic <davidnovakovic at gmail.com> wrote:
>
> They are very large numbers indeed. Thanks for giving me a wake-up call.
> Currently my data is represented as vectors in a vectorset, a typical
> sparse representation.
>
> I reduced the problem significantly by removing lots of noise. I'm
> basically recording traces of a term's occurrences throughout a corpus
> and doing an analysis of the eigenvectors.
>
> I reduced my matrix to 4863 x 4863 by filtering the original corpus.
> Now when I attempt the SVD, I hit a memory error in the svd routine.
> Is there a hard upper limit on the size of a matrix for these
> calculations?


I get the same error here with linalg.svd(eye(5000)), and the memory is
indeed exhausted. Hmm, I think it should work, although it is certainly
pushing the limits of what I've got: linalg.svd(eye(1000)) runs fine. 4 GB
should be enough if your memory limits are set high enough.
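For a sense of scale, here is a back-of-the-envelope sketch of the footprint
(the helper name is made up, and only the output arrays are counted; the
LAPACK workspace adds several more matrix-sized buffers on top of this, so
treat the figure as a floor):

    # Rough minimum footprint of a full SVD of an n x n float64 matrix:
    # the input matrix, U, V^T, and the vector of singular values.
    def svd_footprint_mb(n, itemsize=8):
        a  = n * n * itemsize      # input matrix A
        u  = n * n * itemsize      # left singular vectors U
        vt = n * n * itemsize      # right singular vectors V^T
        s  = n * itemsize          # singular values
        return (a + u + vt + s) / 2.0**20

    print(svd_footprint_mb(4863))   # ~541 MB before workspace
    print(svd_footprint_mb(5000))   # ~572 MB before workspace

That is well under 4 GB on paper, which is in line with the guess above that
it is the process memory limit, rather than the matrix size itself, that is
biting.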

Are you trying some sort of principal components analysis?
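If so, a minimal sketch of PCA via the SVD would look something like the
following; the names and the toy data are made up for illustration, and X
stands in for a dense documents-by-terms matrix that fits in memory:

    import numpy as np

    # Toy stand-in for the real data: 200 documents x 50 terms.
    X = np.random.randn(200, 50)

    Xc = X - X.mean(axis=0)                  # center each column (term)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

    components = Vt                          # principal axes, one per row
    explained_variance = s**2 / (Xc.shape[0] - 1)
    scores = U * s                           # data projected onto those axes

With full_matrices=False only the thin factors are formed, which keeps the
memory cost proportional to the smaller of the two dimensions.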

<snip>

Chuck