Berkeley DB. How to iterate over a large number of keys "quickly"

Christoph Haas email at christoph-haas.de
Thu Aug 2 16:32:25 EDT 2007


On Thu, Aug 02, 2007 at 07:43:58PM -0000, lazy wrote:
> I have a Berkeley DB and I'm using the bsddb module to access it. The DB
> is quite huge (anywhere from 2-30GB). I want to iterate over the keys
> serially.
> I tried using something basic like
> 
>     for key in db.keys():
> 
> but this takes a lot of time. I guess Python is trying to build the list
> of all keys first and probably keeps it in memory. Is there a way to
> avoid this, since I just want to access the keys serially?

Does db.iterkeys() work better?
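
For the record, a minimal sketch, assuming the file was opened through
the legacy bsddb interface (bsddb.btopen/bsddb.hashopen); 'mydb.db' is
a placeholder path:

    import bsddb

    # 'mydb.db' is a placeholder; assumes a btree file opened
    # read-only through the legacy bsddb interface.
    db = bsddb.btopen('mydb.db', 'r')
    try:
        # Unlike keys(), iterkeys() yields one key at a time instead
        # of building the full key list in memory first.
        for key in db.iterkeys():
            pass  # process key here
    finally:
        db.close()

If iterkeys() isn't available in your Python version, the legacy
cursor methods give the same effect: call db.first() once, then
db.next() in a loop until it raises KeyError at the end of the file.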

 Christoph



