Memory leak in Python

Karthik Gurusamy kar1107 at gmail.com
Tue May 9 23:06:43 EDT 2006


diffuser78 at gmail.com wrote:
> The amount of data I read in is actually small.
>
> If you see my algorithm above, it deals with 2000 nodes, and each
> node has a lot of attributes.
>
> When I close the program, my computer becomes stable and performs as
> usual. I checked in Performance Monitor and with "top": all of the
> physical memory is in use, and on top of that around half a gig of
> swap is also being used.
>
> Please give some helpful pointers to overcome such memory errors.
>
> I revisited my code and found nothing obvious that would let this
> leak happen. How do I kill cross references in the program? I am
> kind of a newbie and not completely familiar with fine-tuning this
> kind of program.
>

I suspect you are trying to store each node's attributes in every
other node. Basically you have an O(N^2) algorithm (in space, and
probably worse in time). For N=2000, N^2 is 4,000,000, which is big
enough to cause the memory problems you're seeing.

Try not to store O(N^2) information; see if you can scale the memory
requirement linearly in N. That is, see if you can store each node's
attributes in exactly one place.
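
Something like the following is what I have in mind. It's only a
minimal sketch; the Node class and the 'weight' attribute are made-up
placeholders for whatever your nodes actually carry:

    class Node(object):
        """A node's attributes live in exactly one place: the node itself."""
        def __init__(self, node_id, attributes):
            self.node_id = node_id
            self.attributes = attributes   # stored once, here only
            self.peer_ids = set()          # peers tracked by id, not by copy

    # Attribute storage grows linearly in N, because no node keeps
    # copies of any other node's attributes.
    nodes = {}
    for i in range(2000):
        nodes[i] = Node(i, {'weight': i})
    nodes[0].peer_ids.add(1)               # record an edge by id only

    # When a node needs a peer's attributes, look them up on demand:
    for peer_id in nodes[0].peer_ids:
        print(nodes[peer_id].attributes)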

I'm just guessing at your implementation, but from what you say
(peer-to-peer), I suspect there is an O(N^2) requirement somewhere.
Also try experimenting with small N (say, 100 nodes) and watch how
memory grows as you scale up.
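
Here is a rough way to check the scaling (Unix only; build_network is
a hypothetical stand-in for whatever builds your network):

    import resource

    def build_network(n):
        # stand-in: replace with your own n-node setup code
        return [{'node_id': i, 'weight': i} for i in range(n)]

    for n in (100, 200, 400, 800):
        net = build_network(n)
        # ru_maxrss is the peak resident size (KB on Linux, bytes on
        # some platforms); it only ever grows, so compare the steps.
        peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
        print("N=%d peak=%d" % (n, peak))
        del net

If memory is O(N), the peak should roughly double at each step; if it
is O(N^2), it should roughly quadruple.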

Thanks,
Karthik

> Thanks
>
>
> bruno at modulix wrote:
> > diffuser78 at gmail.com wrote:
> > > I have Python code which runs on a huge data set. After starting
> > > the program, the computer becomes unstable and it gets very
> > > difficult even to open a konsole to kill that process. What I am
> > > assuming is that I am running out of memory.
> > >
> > > What should I do to make sure that my code runs fine without
> > > becoming unstable? How should I address the memory leak problem,
> > > if any? I have a gig of RAM.
> > >
> > > Any help is appreciated.
> >
> > Just a hint: if you're trying to load your whole "huge data set"
> > into memory, you're in for trouble whatever the language - for
> > example, doing a 'buf = openedFile.read()' on a 100 gig file may
> > not be a good idea... [see the sketch after this quote]
> >
> >
> >
> > --
> > bruno desthuilliers
> > python -c "print '@'.join(['.'.join([w[::-1] for w in p.split('.')]) for
> > p in 'onurb at xiludom.gro'.split('@')])"
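
For what it's worth, a minimal sketch of the streaming style bruno is
hinting at: iterate over the file object so only one line is in memory
at a time ('huge_data.txt' is a hypothetical file name):

    count = 0
    with open('huge_data.txt') as f:
        for line in f:
            count += 1      # handle one line, then let it go
    print("%d lines" % count)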



