Is there no compression support for large sized strings in Python?
Claudio Grondi
claudio.grondi at freenet.de
Thu Dec 1 17:37:02 EST 2005
"Harald Karner" <harald_karner at a1.net> wrote in the message
news:newscache$t6wtqi$jy21$1 at inet.ecofinance.com...
> Claudio Grondi wrote:
> > Anyone on a big Linux machine able to do e.g. :
> > >python -c "print len('m' * 2500*1024*1024)"
> > or even more without a memory error?
>
> I tried on a Sun with 16GB Ram (Python 2.3.2)
> seems like 2GB is the limit for string size:
>
> > python -c "print len('m' * 2048*1024*1024)"
> Traceback (most recent call last):
> File "<string>", line 1, in ?
> OverflowError: repeated string is too long
>
> > python -c "print len('m' * ((2048*1024*1024)-1))"
> 2147483647
In this context I am very curious how many such
2 GByte strings can be created within a
single Python process,
i.e. at which of the following lines, executed
as one script, does a memory error occur?
dataStringA = 'A'*((2048*1024*1024)-1) # 2 GByte
dataStringB = 'B'*((2048*1024*1024)-1) # 4 GByte
dataStringC = 'C'*((2048*1024*1024)-1) # 6 GByte
dataStringD = 'D'*((2048*1024*1024)-1) # 8 GByte
dataStringE = 'E'*((2048*1024*1024)-1) # 10 GByte
dataStringF = 'F'*((2048*1024*1024)-1) # 12 GByte
dataStringG = 'G'*((2048*1024*1024)-1) # 14 GByte
leaving 2 GByte for the system on a 16 GByte machine ... ;-)
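The experiment above can be automated with a small loop that keeps allocating strings until a MemoryError is raised. This is just a sketch (the function name and parameters are my own, and on 32-bit builds the 2 GByte per-string limit would trigger an OverflowError before any MemoryError); shown here with tiny sizes so it runs anywhere:

```python
def allocate_strings(count, size):
    """Try to build `count` strings of `size` characters each.

    Returns the number of strings successfully allocated
    before a MemoryError occurred (if any)."""
    strings = []
    try:
        for i in range(count):
            # Use a different fill character per string, like the
            # dataStringA .. dataStringG lines above.
            strings.append(chr(ord('A') + i) * size)
    except MemoryError:
        pass
    return len(strings)

# Tiny sizes for demonstration; to reproduce the experiment above,
# replace `size` with (2048*1024*1024)-1 on a machine with enough RAM.
print(allocate_strings(7, 1024))  # → 7 on any modern machine
```

On a 16 GByte machine one would expect the loop to fail somewhere around the seventh 2 GByte string, depending on what else is using memory.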
Claudio