Memory Problem

Bruno Desthuilliers bruno.42.desthuilliers at wtf.websiteburo.oops.com
Tue Sep 18 10:46:27 EDT 2007


Christoph Scheit wrote:
> On Tuesday 18 September 2007 15:10, Marc 'BlackJack' Rintsch wrote:
>> On Tue, 18 Sep 2007 14:06:22 +0200, Christoph Scheit wrote:
>>> Then the data is added to a table, which I use for the actual
>>> post-processing. The table is actually a class with several "columns",
>>> each column internally represented by an array.

(snip)
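
Side note: the columnar layout you describe might look something like
this with the stdlib array module - column names and type codes are my
guesses, not your actual schema:

import array

class Table:
    """Columnar table: one typed array per column."""
    def __init__(self):
        self.grid_point = array.array('i')  # 4-byte signed int
        self.time_step  = array.array('i')
        self.key_a      = array.array('i')
        self.key_b      = array.array('i')
        self.value      = array.array('d')  # 8-byte float

    def append(self, g, t, a, b, v):
        self.grid_point.append(g)
        self.time_step.append(t)
        self.key_a.append(a)
        self.key_b.append(b)
        self.value.append(v)

Typed arrays store raw machine values, so they are far more compact
than boxed Python objects - but querying and indexing are still things
a database gives you for free.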

> I have to deal with several million data points; right now I'm trying an
> example with 360 grid points and 10000 time steps, i.e. 3 600 000 entries
> (and each row consists of 4 ints and one float).

Hem... May I suggest that you use a database, then? If you don't want to
bother with a full-blown RDBMS, have a look at SQLite - it's
lightweight, works fine for this kind of volume, and is a no-brainer to use.
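
Just a minimal sketch of what I mean - the file name, table name and
column names are my own assumptions about your data:

import sqlite3

conn = sqlite3.connect("results.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS samples (
        grid_point INTEGER,
        time_step  INTEGER,
        key_a      INTEGER,
        key_b      INTEGER,
        value      REAL
    )
""")

def rows():
    # Stand-in for your real data source:
    # 360 grid points x 10000 time steps.
    for t in range(10000):
        for g in range(360):
            yield (g, t, 0, 0, 0.0)

# executemany streams the rows in; a single commit keeps inserts fast.
conn.executemany("INSERT INTO samples VALUES (?, ?, ?, ?, ?)", rows())
conn.commit()

# Post-processing then becomes a query instead of an in-RAM scan:
for g, avg in conn.execute(
        "SELECT grid_point, AVG(value) FROM samples GROUP BY grid_point"):
    print(g, avg)

conn.close()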

> Of course, the more keys, the bigger the dictionary, but is there a way
> to evaluate the actual size of the dictionary?

You can refer to the thread "creating really big lists" for a
quick-and-dirty rough approximation of such an evaluation. But a dict
holding 3.6 million boxed rows will likely run to hundreds of
megabytes - way too big to even consider keeping it all in RAM anyway.



