python reading file memory cost

Dan Stromberg drsalists at gmail.com
Mon Aug 1 12:30:54 EDT 2011


A code snippet would work wonders in making sure you've communicated what
you really need, or at least what you have now.

But if you read the data into one big string, that'll be much more efficient
than if you read it as a list of integers or even as a list of lines.
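To illustrate the difference, a sketch in modern Python 3 (exact sizes are CPython- and platform-specific): with a per-element representation, the list's pointer array alone can dwarf the payload, before even counting the element objects it references.

```python
import sys

data = b"x" * 1_000_000          # a 1 MB payload

as_string = data                 # one object holding the whole payload
as_lines = data.split(b"\n")     # a list plus one object per line
as_ints = list(data)             # a list with one entry per byte

print(sys.getsizeof(as_string))  # close to 1 MB
print(sys.getsizeof(as_ints))    # the list's pointer array alone is
                                 # ~8 bytes per element on 64-bit builds,
                                 # not counting the int objects it points to
```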

Processing the data one chunk or one line at a time will be far more
memory-efficient.
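A minimal sketch of both approaches (the helper names and the sample file are illustrative): iterating over a file object yields one line at a time, and `read(n)` returns fixed-size chunks, so neither holds the whole file in memory.

```python
import os
import tempfile

def count_bytes_by_line(path):
    """Sum line lengths while holding only one line in memory at a time."""
    total = 0
    with open(path, "rb") as f:
        for line in f:               # the file object yields lines lazily
            total += len(line)
    return total

def count_bytes_by_chunk(path, chunk_size=1 << 20):
    """Same idea with fixed-size binary chunks (1 MiB per read here)."""
    total = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            total += len(chunk)
    return total

# Demo on a small temporary file standing in for the 1 GB input.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"line one\nline two\n")
print(count_bytes_by_line(path))     # 18
print(count_bytes_by_chunk(path))    # 18
os.remove(path)
```

Peak memory is then bounded by the longest line (or the chunk size), not the file size.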

2011/8/1 Tong Zhang <warriorlance at gmail.com>

> Hello, everyone!
>
> I am trying to read a fairly big txt file (~1 GB) with Python 2.7. What I
> want to do is read the data into an array; meanwhile, I monitor the memory
> cost, and I found that it costs more than 6 GB of RAM! So I have two
> questions:
>
> 1: How can I estimate the memory cost before executing a Python script?
>
> 2: How can I save RAM without increasing execution time?
>
> Any answer will be appreciated!
>
> --Tony Zhang
>
> --
> http://mail.python.org/mailman/listinfo/python-list
>
>


More information about the Python-list mailing list