[issue9520] Add Patricia Trie high performance container

Dmitry Chichkov report at bugs.python.org
Fri Aug 6 03:40:23 CEST 2010


Dmitry Chichkov <dchichkov at gmail.com> added the comment:

No, I'm not simply running out of system memory (8GB/x64/Linux). In my test cases I've only seen ~25% of memory utilized. And that's a good idea; I'll try playing with the cyclic garbage collector.
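
A minimal sketch of what I mean by playing with the collector, assuming a plain dict word-count load is the bottleneck (the sizes here are made up, not from my benchmark):

import gc
import time

def bulk_insert(words):
    # Straightforward word-count load into a plain dict.
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1
    return counts

words = ['word%d' % (i % 5000) for i in range(1000000)]

gc.disable()            # suspend cyclic GC during the bulk load
try:
    t0 = time.time()
    counts = bulk_insert(words)
    print('insert took %.2fs, %d unique' % (time.time() - t0, len(counts)))
finally:
    gc.enable()         # always re-enable afterwards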

It is harder than I thought to make a solid synthetic test case addressing that issue. The trouble is that you need to be able to generate data (e.g. 100,000,000 words / 5,000,000 unique) with a distribution close to that of the real-life scenario (e.g. word lengths, frequencies, and uniqueness of English text). If somebody has a good idea on how to do this nicely, you'd be very welcome.

My best shot so far is in the attachment.
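
Besides the attachment, here is a rough sketch of one way to approximate such a distribution with just the standard library: word frequencies in English text are roughly Zipfian, so ranks are drawn from a heavy-tailed law and mapped to pseudo-words whose length grows with rank. The vocabulary size and exponent below are placeholder assumptions and would need tuning against real text:

import random

def make_word(rank):
    # Deterministically derive a pseudo-word from its rank; rarer words
    # (higher ranks) get longer spellings, loosely mimicking English.
    rnd = random.Random(rank)
    length = 2 + len(str(rank)) + rnd.randint(0, 3)
    return ''.join(rnd.choice('abcdefghijklmnopqrstuvwxyz')
                   for _ in range(length))

def generate_words(n, vocab_size=5000000, alpha=1.1):
    # Draw word ranks from a heavy-tailed (roughly Zipf-like) law so a
    # small set of ranks dominates, then clip to the vocabulary size.
    rnd = random.Random(42)
    for _ in range(n):
        rank = min(int(rnd.paretovariate(alpha)), vocab_size)
        yield make_word(rank)

if __name__ == '__main__':
    sample = list(generate_words(100000))
    print('%d words, %d unique' % (len(sample), len(set(sample))))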

----------
Added file: http://bugs.python.org/file18406/dc.dict.bench.0.01.py

_______________________________________
Python tracker <report at bugs.python.org>
<http://bugs.python.org/issue9520>
_______________________________________
