python shutting down sloooooooowly/tuning dictionaries

Till Plewe till at score.is.tsukuba.ac.jp
Wed Apr 21 11:39:23 EDT 2004


Is there a way to speed up killing Python from within a Python program?
Sometimes shutting down takes more than 10 times as long as the actual
run of the program.

The programs are fairly simple (searching/organizing large boardgame
databases) but use a lot of memory (1-6 GB).  The memory mostly goes to
simple structures like trees or relations: typically a few large
dictionaries and many small dictionaries/sets/arrays/lists.

Since the entire database is too large to handle in one run, I
typically run the same program 10-100 times on smaller parts. The
annoying bit is that finishing one instance of the program before the
next can start takes a lot of time. I have started using a shell script
that checks whether certain files have been written and then kills the
program from the OS to speed things up. But this solution is way too
ugly. There should be a better way, but I don't seem to be able to
find one.
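
The wrapper amounts to something like this (a Python rendering of the
shell script just to illustrate the idea; the sentinel file name and
the command are made up):

    import os
    import signal
    import subprocess
    import time

    # Hypothetical names: run_part.py processes one chunk of the
    # database and writes results.done once its output is on disk.
    proc = subprocess.Popen(["python", "run_part.py", "part-01"])

    while proc.poll() is None:          # child still running?
        if os.path.exists("results.done"):
            # The output is already written; don't wait for the
            # interpreter to tear down several GB of dictionaries,
            # just kill the process.
            os.kill(proc.pid, signal.SIGKILL)
            break
        time.sleep(1)

    proc.wait()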

A second question: is it possible to control the resizing of
dictionaries?  It seems that resizing always doubles the size, but for
a dictionary already using more than half of the remaining memory this
is a problem. I would rather have a slightly too-full dictionary in
main memory than a sparse dictionary partially swapped out.

I would like to be able to restrict the maximum growth of each resize
operation to around 100 MB.
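
For what it's worth, the resize jumps are easy to observe with a few
lines like these (assuming a Python with sys.getsizeof; it measures
only the dict's own table, not the stored keys and values):

    import sys

    d = {}
    last = sys.getsizeof(d)
    for i in range(2000000):
        d[i] = None
        size = sys.getsizeof(d)
        if size != last:
            # each jump in the reported size is one resize of the table
            print("%9d keys -> %12d bytes for the table" % (len(d), size))
            last = size

Presumably the new, larger table has to be allocated while the old one
still exists, which is what makes a resize so painful for a dictionary
that already fills most of memory.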

TIA.

- Till




