checking available system memory?
Jeremy Bowers
jerf at jerf.org
Fri Oct 15 19:29:48 EDT 2004
On Fri, 15 Oct 2004 15:59:12 -0400, Darren Dale wrote:
> I am doing linear algebra with large numarray. It is very efficient, but I
> have a small problem due to the size of my data. The dot product of a
> 10,000x3 double array with a 3x6,250,000 double array will consume
> 500GB of memory. I need to break the operations up into manageable
> chunks, so I don't consume all the available memory and get a
> segmentation fault.
>
> It's not a problem with numpy, I just need to intelligently slice up one
> of my arrays so my routine works within the available system resources.
> Are there any utilities that can query how much memory is available?
I don't know what you're doing with that, but you're well into the domain
where you may have to trade running time for memory.
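One general way to make that trade is to compute the product in row chunks, so that only one slice of the result exists at a time. A minimal sketch, assuming NumPy-style arrays (the function name and chunk size are illustrative, not from the original post):

```python
import numpy as np

def chunked_dot(a, b, chunk_rows=1000):
    """Yield successive row blocks of dot(a, b) without ever
    allocating the full result array at once."""
    for start in range(0, a.shape[0], chunk_rows):
        # Each block has at most chunk_rows rows; process or write
        # it out before the next block is computed.
        yield np.dot(a[start:start + chunk_rows], b)

# Usage: stacking the blocks reproduces the full product.
a = np.arange(12.0).reshape(4, 3)
b = np.arange(18.0).reshape(3, 6)
assert np.allclose(np.vstack(list(chunked_dot(a, b, chunk_rows=2))),
                   np.dot(a, b))
```

The chunk size is the knob: smaller chunks mean a smaller peak footprint but more Python-level loop overhead.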
I am not familiar with the terms "10,000x3 double array with a 3x6,250,000
double array" (particularly "double array"), but speaking in general
terms, assuming the dot product is something like the vector dot product I
know, you can wrap your two source arrays in an object that lazily
computes the relevant dot product. A skeleton:
class LazyDotProduct(object):
    def __init__(self, a, b):
        self.a = a
        self.b = b

    def __getitem__(self, index):
        # Compute (or look up) just the requested element on demand.
        return dot_product(self.a, self.b, index)
Add an optional cache to __getitem__ if you need it and can afford it.
"dot_product" computes the relevant dot product element.
Just a thought; I may be over-extrapolating from what I know.
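As for the question of querying available memory: there is no portable standard-library call for it, but on Linux one option is to parse /proc/meminfo. A hedged sketch (the field names and the /proc path are Linux-specific assumptions; the path parameter exists only so the helper can be exercised against a fake file):

```python
def available_memory_kb(path="/proc/meminfo"):
    """Return an estimate of available memory in kB."""
    fields = {}
    with open(path) as f:
        for line in f:
            key, _, rest = line.partition(":")
            if key in ("MemAvailable", "MemFree"):
                fields[key] = int(rest.split()[0])  # values are in kB
    # MemAvailable (newer kernels) is the better estimate, since it
    # accounts for reclaimable caches; fall back to MemFree without it.
    if "MemAvailable" in fields:
        return fields["MemAvailable"]
    if "MemFree" in fields:
        return fields["MemFree"]
    raise RuntimeError("no memory field found in %s" % path)
```

Even with such a number in hand, chunking conservatively is wiser than sizing allocations right up to the reported limit, since other processes compete for the same memory.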