Memory management

Larry Whitley ldw at us.ibm.com
Tue Jul 25 11:59:33 EDT 2000


I'm running out of virtual memory on my NT 4.0 system while running a longish
Python script.  Python reports "memory error" and quits.  NT shows the
application using 130+ megabytes.  I have 192 megabytes on the system, so
that figure seems plausible.  I suspect this means I have a memory leak in
my code.

In the outer loop of my code, I create an object of a class I have defined.
The outer loop reads records from a trace file and hands them to that object
for execution.  This first-level object creates a number of second-level
objects of a different class while processing the trace file.  When the
trace file is complete, the outer loop deletes the first-level object.  I
expect that to drive the reference counts of everything held by that object,
and by the second-level objects it created, to zero.

Here's a programmatic description:

def outerloop():
    # make a list of trace files to process
    for file in infiles:
        # create the object that will process the trace records
        o1 = FirstLevelClass( some, parameters)
        o1.runTrace( file )
        o1.report()
        del o1

class FirstLevelClass:
    def __init__( self, some, parameters):
        self.aDict = {}  # these get filled over time
        self.aList = []
    def runTrace( self, file ):
        ...
        o1a = SecondLevelClass( some, parameters)
        ...

class SecondLevelClass:
    def __init__( self, some, parameters):
        self.aDict = {}  # these get filled over time
        self.aList = []

My question:  How do I reclaim the memory used by o1, and by o1a, o1b, etc.
within o1, once each trace file has been processed?
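For what it's worth, here is a minimal, self-contained sketch of how the
lifetime of o1 could be checked with the standard weakref and gc modules
(the class bodies are stripped-down stand-ins for my real ones, and the
"children" attribute is just an illustration of the second-level objects):

    import gc
    import weakref

    class SecondLevelClass:
        def __init__(self):
            self.aDict = {}
            self.aList = []

    class FirstLevelClass:
        def __init__(self):
            self.aDict = {}
            self.aList = []
            # stand-in for the second-level objects created during runTrace
            self.children = [SecondLevelClass() for _ in range(3)]

    o1 = FirstLevelClass()
    probe = weakref.ref(o1)   # a weak reference does not keep o1 alive
    del o1
    gc.collect()              # in CPython, also reclaims reference cycles
    print(probe() is None)    # True means o1 (and its children) were freed

If the probe still returns the object after del, something else is holding
a reference to it, perhaps a cycle between the first- and second-level
objects.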

Larry





