Memory not being released?

Darrell news at dorb.com
Mon Nov 15 14:19:01 EST 1999


Here's an idiom I like in this case. Store your objects in a dict keyed by
id(obj), just as you've done. Then, when objects must reference each other,
have them hold the id(obj) and look the object up in the dict, and delete
entries from the dict when they're no longer needed. Yes, it's slow, but only
use it for these big, ugly cyclic data structures.
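
Something along these lines, as a rough sketch (the Node class and the helper
names are mine, not from the original post; Python 2 syntax to match the
session below; also note that id() values can be reused once an object has
been freed, so stale ids are a hazard of this idiom):

# registry maps id(obj) -> obj, so objects can find each other by id
registry = {}

class Node:
    def __init__(self, name):
        self.name = name
        self.other_id = None              # hold an id, not the object itself
        registry[id(self)] = self         # register so the id can be resolved

    def link(self, other):
        self.other_id = id(other)         # reference by id(), so no direct cycle

    def other(self):
        return registry.get(self.other_id)  # look the object back up

def release(obj):
    del registry[id(obj)]                 # drop the entry when the object is done

a = Node('a')
b = Node('b')
a.link(b)
b.link(a)                                 # mutual references, but only via ids
print a.other().name                      # prints 'b'
release(a)
release(b)                                # nothing in the registry holds them now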

But notice how quickly memory gets used up; this alone eats 84 MB:
>>> d = {}
>>> for x in xrange(1000000):
...     d[x] = (x, x + 1)
...
>>>

You might take a look at shelve. It gives you a persistent dictionary.
I've been playing with a flat file manager that can reclaim freed space.
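
For example, a minimal sketch using the standard shelve module (the filename
is just a placeholder, Python 2 syntax again):

import shelve

db = shelve.open('objects.db')     # data lives in a file, not in RAM
for x in xrange(1000):
    db[str(x)] = (x, x + 1)        # shelve keys must be strings
print db['42']                     # prints (42, 43)
db.close()                         # flush and close the file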

--
--Darrell
Steve Tregidgo <smst at bigfoot.com> wrote in message
news:80pdvo$99r$1 at nnrp1.deja.com...
> Hi there,
>
> I'm having problems with a large process, and I'd really appreciate
> some help in tracking down the cause of its bloatedness.
>






