very large dictionaries

Larry Bates lbates at swamisoft.com
Wed Jun 16 15:38:12 EDT 2004


1) If the key is 10 bytes on top of each ~100-byte
record, 50 million records works out to 5.5 GB of
memory at a minimum.  That is an unusually large
amount of memory.
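
For the record, the arithmetic behind that figure (a
minimal back-of-envelope sketch; the numbers are the
ones quoted above, not measurements):

records = 50 * 10**6
record_bytes = 100   # "less than 100 bytes wide"
key_bytes = 10       # the 10-byte key assumed above
total = records * (record_bytes + key_bytes)
print(total / 10**9, "GB")   # -> 5.5 GB

And it really is a floor: a CPython dict adds per-entry
overhead (hash slots plus key and value object headers)
on top of the raw bytes.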

2) You can't really "search" through a dictionary's
values quickly, but you can index into it via its
key very fast.  Then again, a database read with an
indexed key on a table is also very quick, and it
has no startup cost of reading 50 million records
into memory first.
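
To make that distinction concrete, here is a minimal
sketch of both approaches (the record layout and the
use of sqlite3 are illustrative assumptions on my
part, not anything from the original post):

import sqlite3

# In-memory dictionary: keyed access is a single hash
# lookup, but you pay the startup cost of loading all
# records before the first query.
records = {"key001": "payload one", "key002": "payload two"}
print(records["key001"])

# Indexed table: each read is a quick B-tree probe and
# nothing is bulk-loaded into Python memory at startup
# (a real table would live on disk, not in ":memory:").
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (k TEXT PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)", records.items())
row = conn.execute("SELECT payload FROM t WHERE k = ?",
                   ("key001",)).fetchone()
print(row[0])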

We would really need to know more about what
you mean by "search through" to make an intelligent
suggestion.

Larry Bates
Syscon, Inc.

"robin" <escalation746 at yahoo.com> wrote in message
news:gk61d09qictqsed45re9qng6697e2dm3sk at 4ax.com...
> I need to do a search through about 50 million records, each of which
> is less than 100 bytes wide. A database is actually too slow for
> this, so I thought of optimising the data and putting it all in
> memory.
>
> There is a single key field, so a dictionary is an obvious choice for
> a structure, since Python optimises these nicely.
>
> But is there a better choice? Is it worth building some sort of tree?
>
> -- robin
