Why do custom objects take so much memory?

jsanshef jsanpedro at gmail.com
Tue Dec 18 14:26:41 EST 2007


Hi,

after a couple of days of debugging a script, I found that some
assumptions I was making about the memory complexity of my classes are
not true. I wrote a simple script to isolate the problem:

import gc

class MyClass:
	def __init__(self, s):
		self.mystring = s

mylist = []
for i in range(1024*1024):
	mylist.append(MyClass(str(i))) # allocation
#stage 1
mylist = None
gc.collect()
#stage 2

I measure the memory consumption of the script at #stage 1 and at
#stage 2, and I get:
#stage1 -> 238MB
#stage2 -> 15MB
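(The post doesn't say how these figures were taken; purely as an illustration, one way to get a comparable number on Unix is `resource.getrusage`. Note that it reports the *peak* resident size, so it can reproduce the stage-1 figure but not the drop at stage 2:)

```python
import resource

def peak_rss_kb():
    # Peak resident set size of this process so far.
    # Units are kilobytes on Linux, bytes on macOS.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

print(peak_rss_kb())
```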

That means every object is around 223 bytes in size! That's too
much, considering each one only contains a string of at most 7
characters.
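A rough way to see where those bytes go is `sys.getsizeof`, which reports the shallow size of each piece (a sketch only; exact numbers vary with Python version and platform). Each instance pays for an object header, a per-instance attribute dict, and the string itself:

```python
import sys

class MyClass:
    def __init__(self, s):
        self.mystring = s

obj = MyClass("1048575")  # a 7-char string, like the largest in the loop

# The per-object cost is roughly the sum of these shallow sizes,
# plus the list slot that holds the reference.
print(sys.getsizeof(obj))           # the instance itself
print(sys.getsizeof(obj.__dict__))  # the per-instance attribute dict
print(sys.getsizeof(obj.mystring))  # the string object
```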

If you change the allocation line to this one:

mylist.append(str(i)) # we don't create the custom class, but append the string directly to the list

the numbers decrease substantially to:
#stage1 -> 47.6MB
#stage2 -> 15MB
(so this time each string occupies around 32 bytes... still a
lot, isn't it?)

So, what exactly is going on behind the scenes? Why are custom
objects SO expensive? What other, cheaper-in-memory ways of creating
structures can be used?
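One commonly suggested way to cut the per-instance overhead (shown here only as an illustrative sketch, not necessarily the whole answer) is `__slots__`, which removes the per-instance `__dict__`:

```python
import sys

class Plain:
    def __init__(self, s):
        self.mystring = s

class Slotted:
    __slots__ = ("mystring",)  # no per-instance __dict__ is allocated
    def __init__(self, s):
        self.mystring = s

p, q = Plain("1048575"), Slotted("1048575")

# The slotted instance has no attribute dict to pay for,
# so its shallow size is smaller than instance + dict combined.
print(sys.getsizeof(p) + sys.getsizeof(p.__dict__))
print(sys.getsizeof(q))  # Slotted instances have no __dict__ at all
```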

Thanks a lot in advance!





