[Python-Dev] Improving the Python Memory Allocator

Evan Jones ejones at uwaterloo.ca
Mon Jan 24 14:50:19 CET 2005


On Jan 24, 2005, at 7:00, Rodrigo Dias Arruda Senra wrote:
> Depending on the cost of arena allocation, it might help to define a 
> lower threshold keeping a minimum of empty arena_objects permanently 
> available. Do you think this can bring any speedup ?

Yes, I think it might. I have to do some more benchmarking first, to 
try and figure out how expensive the allocations are. This is one of my 
"future work" items to work on if this change gets accepted. I have not 
implemented it yet, because I don't want to have to merge one *massive* 
patch. My rough idea is to do something like this:

1. Keep track of the largest number of pages in use at one time.

2. Every N memory operations (or some other measurement of "time"), 
reset this value and calculate a moving average of the number of pages. 
This estimates the current memory requirements of the application.

3. If (used + free) > average, free arenas until freeing one more arena 
would make (used + free) < average.
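The three steps above could be sketched roughly like this. Note that the names (ArenaPool, N_OPS, ALPHA) and the use of an exponential moving average are my own illustrative choices, not anything in the actual patch:

```python
# Hypothetical sketch of the adaptive arena-freeing heuristic described
# above; a real implementation would live in C inside obmalloc.c.

class ArenaPool:
    """Tracks used/free arenas and self-tunes how many free ones to keep."""

    N_OPS = 10_000   # memory operations per "time" interval (assumed value)
    ALPHA = 0.25     # weight of the newest interval in the moving average

    def __init__(self):
        self.used = 0        # arenas currently holding live objects
        self.free = []       # empty arenas kept around for reuse
        self.ops = 0         # memory operations in the current interval
        self.peak = 0        # step 1: largest arena count seen this interval
        self.average = 0.0   # step 2: moving average of those peaks

    def note_op(self):
        """Call on every malloc/free; this is the measurement of 'time'."""
        self.peak = max(self.peak, self.used + len(self.free))
        self.ops += 1
        if self.ops >= self.N_OPS:
            # Step 2: fold this interval's peak into the moving average,
            # then reset the counters for the next interval.
            self.average = (1 - self.ALPHA) * self.average \
                           + self.ALPHA * self.peak
            self.ops = 0
            self.peak = self.used + len(self.free)
            self._trim()

    def _trim(self):
        # Step 3: release free arenas while (used + free) > average, but
        # stop as soon as freeing one more would drop the total below it.
        while self.free and (self.used + len(self.free) - 1) >= self.average:
            self.free.pop()   # i.e. return one arena to the OS
```

For example, with 3 used arenas, 5 free arenas, and an average of 5.0, `_trim()` would release 3 free arenas and keep 2, since freeing a fourth would push the total below the average.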

This is better than a static scheme which says "keep X MB of free 
memory around" because it will self-tune to the application's 
requirements. If you have an application that needs lots of RAM, it 
will keep lots of RAM. If it has very low RAM usage, it will be more 
aggressive in reclaiming free space. The challenge is how to determine 
a good measurement of "time." Ideally, if the application has been idle 
for a while, you would perform some housekeeping like this. Does Python's 
cyclic garbage collector currently do this? If so, I could hook this 
"management" stuff on to its calls to gc.collect().

Evan Jones
