very large dictionary

Simon Strobl Simon.Strobl at gmail.com
Mon Aug 4 10:02:16 EDT 2008


On 4 Aug., 00:51, Avinash Vora <avinashv... at gmail.com> wrote:
> On Aug 4, 2008, at 4:12 AM, Jörgen Grahn wrote:
>
> > (You might want to post this to comp.lang.python rather than to me --
> > I am just another c.l.p reader.  If you already have done so, please
> > disregard this.)
>
> Yeah, I hit "reply" by mistake and didn't realize it.  My bad.
>
> >>> (I assume here that Berkeley DB supports 7GB data sets.)
>
> >> If I remember correctly, BerkeleyDB is limited to a single file size
> >> of 2GB.
>
> > Sounds likely.  But with some luck maybe they have increased this in
> > later releases?  There seem to be many competing Berkeley releases.
>
> It's worth investigating, but that leads me to:
>
> >> I haven't caught the earlier parts of this thread, but do I
> >> understand correctly that someone wants to load a 7GB dataset into
> >> the
> >> form of a dictionary?
>
> > Yes, he claimed the dictionary was 6.8 GB.  How he measured that, I
> > don't know.
>
> To the OP: how did you measure this?

I created a Python file that contained the dictionary as a literal; the
size of this file was 6.8 GB. I thought it would be practical not to
rebuild the dictionary from a text file each time I needed it, i.e. I
thought loading the compiled .pyc file should be faster. Yet Python
failed to create a .pyc file for it.
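
For illustration, here is a minimal sketch of a disk-backed alternative
using the standard-library shelve module. The input format, file names,
and function names are assumptions for the example, not something from
this thread:

    import shelve

    def build(db_path='frequencies.db', text_path='data.txt'):
        # One-time build: write each pair into a disk-backed mapping.
        # Assumes a hypothetical input file with one tab-separated
        # "key<TAB>value" pair per line.
        db = shelve.open(db_path, flag='n')  # 'n': always create anew
        for line in open(text_path):
            key, value = line.rstrip('\n').split('\t')
            db[key] = value
        db.close()

    def load(db_path='frequencies.db'):
        # Later runs: reopening is cheap because nothing is parsed or
        # compiled; entries are fetched from disk on demand.
        return shelve.open(db_path, flag='r')

Because shelve keeps the data on disk and pickles values individually,
it avoids both the compile step and the need to hold one multi-gigabyte
literal in memory, though the underlying dbm file is still subject to
whatever size limits its backend has (as discussed above for Berkeley
DB).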

Simon


