[Python-Dev] extremely slow exit for program having huge (45G) dict (python 2.5.2)

Antoine Pitrou solipsis at pitrou.net
Mon Dec 22 23:35:30 CET 2008


Martin v. Löwis <martin <at> v.loewis.de> writes:
> 
> It then occurred that there are only 64 different values for nfreepools,
> as ARENA_SIZE is 256kiB, and POOL_SIZE is 4kiB. So rather than keeping
> the list sorted, I now propose to maintain 64 lists, accessible in
> an array of doubly-linked lists indexed by nfreepools. Whenever nfreepools
> changes, the arena_object is unlinked from its current list, and linked
> into the new list. This should reduce the overhead for keeping the list
> sorted from O(n) to O(1), with a moderate overhead of 64 pointers
> (512 bytes in your case).
> 
> Allocation of a new pool would have to do a linear search in these
> pointers (finding the arena with the least number of pools);

You mean the least number of free pools, right? IIUC, the heuristic is to favour
a small number of busy arenas rather than a lot of sparse ones.
And by "linear search in these pointers", do you mean just probing the 64 lists
for the first non-NULL list head? If so, it's likely fast enough for such an
infrequent operation.
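
To make that concrete, here's a toy Python model of the scheme as I
understand it (all names invented; the real change would of course be
C code in Objects/obmalloc.c):

ARENA_SIZE = 256 * 1024               # 256 KiB, as above
POOL_SIZE = 4 * 1024                  # 4 KiB
POOL_COUNT = ARENA_SIZE // POOL_SIZE  # 64 pools per arena

class Arena(object):
    def __init__(self):
        self.nfreepools = POOL_COUNT  # a fresh arena is entirely free

# One bucket per possible nfreepools value (index 0 holds completely
# full arenas, which pick_arena() never returns).  Plain lists stand
# in for the doubly-linked lists: list.remove() is O(n) here, but the
# C version would unlink a node in O(1).
usable_arenas = [[] for _ in range(POOL_COUNT + 1)]

def relink(arena, new_nfreepools):
    # Move the arena to the bucket matching its new free-pool count.
    usable_arenas[arena.nfreepools].remove(arena)
    arena.nfreepools = new_nfreepools
    usable_arenas[new_nfreepools].append(arena)

def pick_arena():
    # Favour busy arenas: probe the buckets from fewest free pools
    # upward and take the first non-empty one.
    for bucket in usable_arenas[1:]:
        if bucket:
            return bucket[0]
    return None  # nothing usable; a new arena must be allocated

The probe in pick_arena() touches at most 64 list heads, so it stays
bounded no matter how many arenas exist.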

Now, we should find a way to benchmark this without having to steal Mike's
machine and wait 30 minutes every time.
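
For a first, crude approximation on an ordinary machine, a scaled-down
script along these lines (sizes invented, nowhere near 45G) would at
least let us compare teardown times with and without the patch:

import time

N = 10 * 1000 * 1000  # invented size; the original report was ~45G

t0 = time.time()
d = dict((i, str(i)) for i in range(N))
print("build:    %.1f s" % (time.time() - t0))

t0 = time.time()
del d  # refcounting frees the dict and its contents right here
print("teardown: %.1f s" % (time.time() - t0))

The many small str values should keep the allocator's arenas busy (int
objects use their own freelist instead), and the interesting number is
how the teardown time scales as N grows.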

Regards

Antoine.



