getting MemoryError with dicts; suspect memory fragmentation

Emin.shopper Martinian.shopper emin.shopper at gmail.com
Thu Jun 3 20:06:15 EDT 2010


On Thu, Jun 3, 2010 at 7:41 PM, dmtr <dchichkov at gmail.com> wrote:
> On Jun 3, 3:43 pm, "Emin.shopper Martinian.shopper"
> <emin.shop... at gmail.com> wrote:
>> Dear Experts,
>>
>
> Are you sure you have enough memory available?
> Dict memory usage can jump x2 during re-balancing.
>

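(As an aside, the resize spike dmtr mentions is easy to see in isolation:
sys.getsizeof shows the dict's table jumping in size as items are added,
and during each resize the old and new tables briefly coexist. A minimal
sketch, assuming CPython; the exact growth factors vary by version:)

    import sys

    d = {}
    last = sys.getsizeof(d)
    for i in range(100000):
        d[i] = i
        size = sys.getsizeof(d)
        if size != last:
            # each jump marks a resize of the underlying hash table
            print("%7d items: %d -> %d bytes" % (len(d), last, size))
            last = size
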
I'm pretty sure. When I did

 p setattr(self,'q',dict([(xxx,xxx) for xxx in range(1300)]))

the memory usage increased trivially (by less than 1 MB), but when I did

 p setattr(self,'q',list([list(xxx)+list(xxx)+list(xxx)+list(xxx) for xxx in self.data]))

it increased by 600 MB.
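
(In case it helps to reproduce the measurements, one rough way to watch the
process footprint from inside the interpreter is a sketch like the one
below -- Linux-specific, since it parses /proc/self/status:)

    def memory_mb():
        # Rough process footprint in MB, parsed from /proc/self/status
        # (Linux-only).  VmSize is the total virtual size, VmRSS the
        # resident portion; the kernel reports both in kB.
        sizes = {}
        for line in open('/proc/self/status'):
            if line.startswith(('VmSize', 'VmRSS')):
                name, value = line.split(':')
                sizes[name] = int(value.split()[0]) / 1024.0
        return sizes

    print(memory_mb())  # call before and after an experiment and compare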

After getting back to the original 900 MB memory usage, doing

 p setattr(self,'q',dict([(xxx,xxx) for xxx in range(1400)]))

gave a MemoryError, suggesting that the problem is not the amount of memory
available but something like fragmentation.
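
(One way to test the fragmentation theory further might be to probe for the
largest single contiguous block the process can still allocate: if only
small blocks succeed even though the OS reports plenty of free memory,
address-space fragmentation -- especially in a 32-bit process -- is the
likely culprit. A rough sketch; the 2 GB starting point is an arbitrary
assumption, and bytearray needs Python 2.6+:)

    def largest_block_mb(start_mb=2048):
        # Halve the requested size until a single contiguous allocation
        # succeeds, and report how big it was (in MB).
        size = start_mb
        while size >= 1:
            try:
                block = bytearray(size * 1024 * 1024)
                del block
                return size
            except MemoryError:
                size //= 2
        return 0

    print(largest_block_mb())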

Thanks,
-Emin


