Sorted and reversed on huge dict ?

Sion Arrowsmith siona at chiark.greenend.org.uk
Mon Nov 6 10:33:16 EST 2006


 <vd12005 at yahoo.fr> wrote:
>so I just tried it, even though I thought it would never run to
>completion => I was wrong: it is around 1,400,000 entries per dict...
>
>but maybe it can be done if the keys of the dicts are not duplicated
>in memory (as all the dicts will have the same keys, with different
>(count) values)?

I've had programs handling dicts with 1.7 million items and it isn't
a great problem, provided you're careful not to duplicate data.
Creating a copy of keys() in a separate list, for example, will
inflate memory usage noticeably.
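
To put a rough number on that, here's a minimal sketch (Python 3,
with an invented `counts` dict standing in for your data) comparing a
keys view with an explicit list of the keys:

    import sys

    # stand-in for one of the OP's ~1.4 million-entry count dicts
    counts = {i: i % 7 for i in range(1_400_000)}

    view = counts.keys()        # lightweight view, no copy of the keys
    copy = list(counts.keys())  # separate list holding 1.4M references

    print(sys.getsizeof(view))  # a few dozen bytes
    print(sys.getsizeof(copy))  # ~11 MB of pointers on a 64-bit build

The dict itself still dominates, but every extra list of 1.4 million
references costs on the order of 10 MB, and each sorted copy costs the
same again.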

>memory is 4 GB of RAM, [ ... ]

Unless you're on a 64-bit OS, that's irrelevant. You'll hit the 2 GB
per-process address-space limit before you start putting a strain on
real memory.
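
If you're not sure which build you're running, a quick check of the
interpreter's pointer width (standard library only) looks like this:

    import struct
    import sys

    # 32-bit pointers mean a ~2-4 GB address-space ceiling per process
    print(struct.calcsize("P") * 8, "bit build")
    print("64-bit" if sys.maxsize > 2**32 else "32-bit")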

-- 
\S -- siona at chiark.greenend.org.uk -- http://www.chaos.org.uk/~sion/
  ___  |  "Frankly I have no feelings towards penguins one way or the other"
  \X/  |    -- Arthur C. Clarke
   her nu becomeþ se bera eadward ofdun hlæddre heafdes bæce bump bump bump


