checking available system memory?

David M. Cooke cookedm+news at physics.mcmaster.ca
Fri Oct 15 16:18:50 EDT 2004


Darren Dale <dd55 at cornell.edu> writes:

> I am doing linear algebra with large numarray arrays. It is very efficient,
> but I have a small problem due to the size of my data. The dot product of a
> 10,000x3 double array with a 3x6,250,000 double array will consume 500 GB
> of memory. I need to break the operations up into manageable chunks, so I
> don't consume all the available memory and get a segmentation fault.
>
> It's not a problem with numpy; I just need to intelligently slice up one of
> my arrays so my routine works within the available system resources. Are
> there any utilities that can query how much memory is available?

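(The arithmetic checks out: the result alone is 10,000 x 6,250,000
doubles, i.e. 6.25e10 elements at 8 bytes each, which is 5e11 bytes, or
500 GB. The two inputs are only about 240 kB and 150 MB; it's the
output that's the problem.)
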
Not really; it tends to be quite operating-system specific.
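
On Linux, for instance, you can scrape /proc/meminfo, but that's
Linux-only, and the numbers are only a rough guide. A minimal sketch:

    def free_memory_kb():
        # Linux-only: /proc/meminfo reports sizes in kB.  Note that
        # MemFree understates usable memory, since the kernel's
        # buffers/cache can be reclaimed under pressure.
        for line in open('/proc/meminfo'):
            if line.startswith('MemFree:'):
                return int(line.split()[1])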

Instead of asking "what's the largest chunk I can do at a time?", ask
"what's the smallest chunk beyond which bigger chunks won't gain me
much?" If you operate on chunks on the order of the processor's cache
size, that's probably sufficient.
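
Since the inputs easily fit in memory and only the result doesn't,
something along these lines might work (an untested sketch;
process_block is a stand-in for whatever you do with each piece of the
result, e.g. appending it to a file):

    import numarray

    def chunked_dot(a, b, process_block, chunk_cols=16):
        # Compute dot(a, b) one column-block of b at a time, so that
        # only an a.shape[0] x chunk_cols piece of the result is ever
        # in memory.  Tune chunk_cols so a block is roughly cache-sized
        # (here 10,000 x 16 doubles is about 1.3 MB).
        for j in range(0, b.shape[1], chunk_cols):
            block = numarray.dot(a, b[:, j:j + chunk_cols])
            process_block(j, block)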

Also, if you're using numarray.dot, note that it doesn't use BLAS (yet),
so it's not as efficient as it would be with an optimized BLAS behind it
(ATLAS, for instance).

-- 
|>|\/|<
/--------------------------------------------------------------------------\
|David M. Cooke
|cookedm(at)physics(dot)mcmaster(dot)ca


