[Baypiggies] reading very large files

Tim Chon devchon at gmail.com
Tue May 17 19:29:50 CEST 2011


You can use a generator; this is answered well here:
http://stackoverflow.com/questions/519633/lazy-method-for-reading-big-file-in-python
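
In case it helps, here is roughly the chunked generator that answer
describes, plus how the same lazy idea handles your first-500-lines
case (a sketch in the Python 2 style of your snippet; the filename is
copied from your message, and the chunk size is an arbitrary choice):

    def read_in_chunks(file_object, chunk_size=1024 * 1024):
        # Yield fixed-size blocks; read() returns '' at EOF,
        # which ends the loop.
        while True:
            data = file_object.read(chunk_size)
            if not data:
                break
            yield data

For line-oriented work you may not even need that: a file object is
itself a lazy iterator over lines, so islice() can cap it at 500
without readlines() ever pulling the whole file into memory:

    from itertools import islice

    fin = open('gene-GS00471-DNA_B01_1101_37-ASM.tsv')
    for line in islice(fin, 500):  # reads one line at a time
        print line.rstrip()
    fin.close()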

--tim

On 5/17/11, Vikram K <kpguy1975 at gmail.com> wrote:
> I wish to read a large data file (around 1.8 MB) and manipulate the
> data in it. Even just reading and printing the first 500 lines of
> this file causes a problem. I wrote:
>
> fin = open('gene-GS00471-DNA_B01_1101_37-ASM.tsv')
> count = 0
> for i in fin.readlines():
>     print i
>     count += 1
>     if count >= 500:
>         break
>
> and got this error msg:
>
> Traceback (most recent call last):
>   File
> "H:\genome_4_omics_study\GS000003696-DID\GS00471-DNA_B01_1101_37-ASM\GS00471-DNA_B01\ASM\gene-GS00471-DNA_B01_1101_37-ASM.tsv\test.py",
> line 3, in <module>
>     for i in fin.readlines():
> MemoryError
>
> -------
> Is there a way to stop Python from slurping all the file contents at once?
>

