Memory leak in Python

diffuser78 at gmail.com diffuser78 at gmail.com
Tue May 9 21:41:22 EDT 2006


The amount of data I read in is actually small.

If you look at my algorithm above, it deals with 2000 nodes, and each
node has a lot of attributes.

When I close the program, the computer becomes stable again and
performs as usual. I checked the usage in Performance Monitor and with
"top": all of the physical memory is in use, and on top of that around
half a gig of swap is being used as well.

Please give me some pointers for overcoming this kind of memory problem.
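
For what it's worth, I understand the standard gc module can report
objects the collector finds but cannot free, which should at least show
whether garbage is piling up between iterations. A minimal sketch (only
the gc calls are real; where the algorithm runs is just a placeholder):

    import gc

    gc.set_debug(gc.DEBUG_LEAK)    # report uncollectable objects on stderr

    # ... run one iteration of the algorithm here ...

    unreachable = gc.collect()     # force a full collection pass
    print "unreachable objects:", unreachable
    print "uncollectable objects:", len(gc.garbage)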

I revisited my code and found nothing obvious that would let such a
leak happen. How do I kill cross references in the program? I am kind
of a newbie and not completely aware of how to fine-tune this kind of
program; a sketch of what I mean is below.
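
The only way I can think of to kill a cross reference is to make one
side of it weak, so the cycle no longer keeps both objects alive. A
minimal sketch with a made-up Node class (not my real code), using the
standard weakref module:

    import weakref

    class Node(object):
        def __init__(self, name):
            self.name = name
            self.children = []   # strong refs: a parent keeps its children alive
            self._parent = None

        def add_child(self, child):
            self.children.append(child)
            child._parent = weakref.ref(self)   # weak back-reference, no cycle

        def parent(self):
            if self._parent is None:
                return None
            return self._parent()   # None if the parent was already collected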

Thanks


bruno at modulix wrote:
> diffuser78 at gmail.com wrote:
> > I have some Python code which is running on a huge data set. After
> > starting the program the computer becomes unstable and it gets very
> > difficult to even open a konsole to kill the process. What I am
> > assuming is that I am running out of memory.
> >
> > What should I do to make sure that my code runs fine without becoming
> > unstable? How should I address the memory leak problem, if any? I have
> > a gig of RAM.
> >
> > Any help is appreciated.
>
> Just a hint : if you're trying to load your whole "huge data set" in
> memory, you're in for trouble whatever the language - for an example,
> doing a 'buf = openedFile.read()' on a 100 gig file may not be a good
> idea...
>
>
>
> --
> bruno desthuilliers
> python -c "print '@'.join(['.'.join([w[::-1] for w in p.split('.')]) for
> p in 'onurb at xiludom.gro'.split('@')])"
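
Following up on bruno's hint: if I understand it correctly, the fix is
to process the file a piece at a time instead of slurping it all in
with read(). A minimal sketch (the 64 KB chunk size is an arbitrary
choice, and process_chunk is a placeholder):

    def process_file(path):
        f = open(path, 'rb')
        try:
            while True:
                chunk = f.read(64 * 1024)   # read at most 64 KB at a time
                if not chunk:               # an empty string means EOF
                    break
                process_chunk(chunk)        # placeholder for the real work
        finally:
            f.close()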



