Memory Error while constructing Compound Dictionary

Alex Martelli aleaxit at yahoo.com
Tue Sep 7 17:55:19 EDT 2004


Benjamin Scott <mynewjunkaccount at hotmail.com> wrote:
   ...
> len(Lst)=1000
> len(nuerLst)=250
> len(nuestLst)=500

So you want 1000*250*500 = 125 million dictionaries...?
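Presumably you're building them in a triply nested loop, something
like this (a guess, since you didn't post the construction code; the
list names come from your message):

    D = {}
    for a in Lst:                  # 1000 items
        D[a] = {}
        for b in nuerLst:          # 250 items
            D[a][b] = {}
            for c in nuestLst:     # 500 items
                D[a][b][c] = {}    # 125,000,000 innermost dicts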

> Specs:
> 
> Python 2.3.4
> XPpro
> 4 GB RAM
> 
> 
> Python was utilizing 2.0 GB when the error was generated.  I have

So you've found out that a dictionary takes at least (about) 16 bytes
even when empty -- not surprising, since 16 bytes is typically the
smallest slice of memory the allocator will hand out at a time.  And
you've found out that XP so-called pro doesn't let a user program have
more than 2 GB to itself.  I believe there are workarounds for that --
costly hacks that may get you 3 GB or so -- but they won't help if you
want to put any information in those dictionaries: even a tiny amount
of info per dict will easily bump each dict's size to 32 bytes and
overwhelm your 32-bit processor's addressing capabilities.  (I'm
assuming you have a 32-bit CPU -- you don't say, but few people use
64-bitters yet.)
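Taking that 16-byte floor at face value, the arithmetic alone shows
why you hit the wall (a quick interactive check; exact per-dict sizes
vary by Python version and platform):

    >>> n = 1000 * 250 * 500                       # dictionaries wanted
    >>> n
    125000000
    >>> print '%.2f GB' % (n * 16 / 1024.0 ** 3)   # at 16 bytes each
    1.86 GB
    >>> print '%.2f GB' % (n * 32 / 1024.0 ** 3)   # at 32 bytes each
    3.73 GB

So even empty dicts nearly fill XP's 2 GB per-process budget, and any
payload at all pushes you past what the hacks can buy back.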

What problem are you really trying to solve?  Unless you can splurge
on a 64-bit CPU with an adequate OS (e.g., an AMD64 box running Linux,
or a G5-based Mac), anything requiring SO many gigabytes probably needs
a radical rethink of your intended architecture/strategy, and it's hard
to give suggestions without knowing what problem you need to solve.
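For example, if it turns out that most of those 125 million
combinations never actually hold any data, one common rethink is to
create entries on demand instead of up front -- just a sketch under
that sparseness assumption, with a made-up `record` helper for
illustration (setdefault works fine on 2.3):

    results = {}

    def record(a, b, c, key, value):
        # Hypothetical helper: create the inner dict only when a
        # combination first receives data; the untouched
        # combinations then cost nothing at all.
        results.setdefault((a, b, c), {})[key] = value

Keying one flat dict by the (a, b, c) tuple also saves the two extra
layers of nested dictionaries.  But whether that fits depends entirely
on what you're actually computing.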


Alex


