python shutting down sloooooooowly/tuning dictionaries

Till Plewe till at score.is.tsukuba.ac.jp
Wed Apr 21 12:38:01 EDT 2004


On Wed, Apr 21, 2004 at 06:12:15PM +0200, Brian Quinlan wrote:
> >Since the entire database is too large I typically run the same
> >program 10-100 times on smaller parts. The annoying bit is that the
> >time for finishing one instance of a program before starting the next
> >instance can take a lot of time.  ...
> 
> You might try disabling garbage collection using the gc module when 
> your application is done. Explicitly closing any files bound to globals 
> might then be a good idea.
> 
> Cheers,
> Brian
> 

Thanks for the suggestions, but unfortunately I had already tried disabling
gc. It did not have any noticeable effect (not that I was patient
enough to time it), nor did using sys.exit(). Also, I am not using
threads, which rules out another possible explanation.
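
For concreteness, the workaround I tried looks roughly like this
(main() is just a stand-in for my actual program):

    import gc
    import sys

    def main():
        # placeholder for the real work: building the large dictionaries
        # and writing out the cdb database
        pass

    if __name__ == "__main__":
        main()
        gc.disable()   # suggested workaround: turn off cyclic GC before exiting
        sys.exit(0)    # also tried; neither made the shutdown noticeably faster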

To give an example of the actual time spans involved: 2-10 minutes for
running the program and more than 30 minutes for shutdown. During the
shutdown phase the amount of memory used by Python does not seem to vary.
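
To get a rough split between the two phases, something like the following
can be dropped into the program (a minimal sketch; the atexit hook just
marks where my own code stops and interpreter cleanup begins):

    import atexit
    import time

    start = time.time()

    def report():
        # atexit handlers run early during interpreter shutdown, before the
        # final teardown of objects, so whatever time the process spends
        # after printing this message is cleanup time.
        print("user code finished after %.1f minutes"
              % ((time.time() - start) / 60.0))

    atexit.register(report)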

As far as closing files is concerned, I am using cdb for writing
read-only databases, and I call cdb.finish() (there is no explicit
close() method; finish() writes all key/value pairs to a temporary
file and then moves that file to wherever you want it). As far as I can see,
once that file has been written and moved, no more I/O should take place.
There is no data loss if I simply kill the Python program.
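
In case it helps, the writing side looks roughly like this (a sketch
assuming the python-cdb cdbmake interface; the file names and data are
stand-ins for my real ones):

    import cdb

    pairs = [("spam", "1"), ("eggs", "2")]           # placeholder key/value data

    maker = cdb.cdbmake("data.cdb", "data.cdb.tmp")  # target file and temp file
    for key, value in pairs:
        maker.add(key, value)
    maker.finish()  # flushes everything to the temp file and renames it into
                    # place; no further I/O on the database should happen after this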
 
- Till




