python newbie - slicing a big memory chunk without GC penalties

Giovanni Bajo noway at sorry.com
Sun Feb 2 14:18:35 EST 2003


"Lukasz Pankowski" <lupan at zamek.gda.pl> ha scritto nel messaggio
news:87hebmy0b6.fsf at zamek.gda.pl...

> if it is large you may consider reading it in small pieces rather than the
> whole file at once:

Yeah, I forgot to mention that I cannot do this because I need to decompress it
while reading, unless I implement some streaming machinery, which I don't plan
to do right now. But anyway...

> there is the built-in buffer which gives you a view of a portion without
> slicing (try help(buffer) in an interactive shell):
>
> for i in range(0, len(buf)/512):
>     Process(buffer(buf, i*512, 512))

This seems to be exactly what I was looking for, thanks. Sadly it does not give
me back the performance I was hoping for, so I will have to look elsewhere
(the per-call startup cost of Process() is probably too high).

> yes, every
>
> buf = buf[512:]
>
> builds a new, slightly shorter string (a full copy) and drops the previous
> one, so lots of copying of a very long string in each iteration, O(n**2) in
> summary

Yeah, I was sort of guessing this.
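
To spell out the difference between the two loops (illustrative only;
Process() is the same hypothetical stand-in as above):

    BLOCK = 512

    def chop_by_reslicing(buf):
        # O(n**2): each pass copies the whole remaining tail into a new string
        while buf:
            Process(buf[:BLOCK])
            buf = buf[BLOCK:]

    def chop_by_offset(buf):
        # O(n): keep the original string and walk an index instead
        pos = 0
        while pos < len(buf):
            Process(buf[pos:pos + BLOCK])  # copies only one small block
            pos += BLOCK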

Thanks
Giovanni





