[Tutor] Load Entire File into memory
Steven D'Aprano
steve at pearwood.info
Mon Nov 4 16:53:41 CET 2013
On Mon, Nov 04, 2013 at 02:48:11PM +0000, Dave Angel wrote:
> Now I understand. Processing line by line is slower because it actually
> reads the whole file. The code you showed earlier:
>
> > I am currently using this method to load my text file:
> > f = open("output.txt")
> > content = io.StringIO(f.read())
> > f.close()
> > But I have found that this method uses 4 times the size of the text file.
>
> will only read a tiny portion of the file. You don't have any loop on
> the read() statement, you just read the first buffer full. So naturally
> it'll be much faster.
Dave, do you have a reference for that? As far as I can tell, read()
will read to EOF unless you open the file in non-blocking mode.
http://docs.python.org/3/library/io.html#io.BufferedIOBase.read
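A quick sketch bears this out: in ordinary blocking mode, a single read() with no size argument returns the whole file, with no loop required (the file name and size here are hypothetical, just for demonstration):

```python
import os
import tempfile

# Write a demonstration file well past any single buffer's size.
with tempfile.NamedTemporaryFile("w", delete=False, suffix=".txt") as f:
    f.write("x" * 1_000_000)
    path = f.name

with open(path) as f:
    content = f.read()  # no loop: blocking-mode read() goes to EOF

os.remove(path)
```

Here len(content) comes back as the full 1,000,000 characters, which is what the documented EOF behaviour predicts.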
> I am of course assuming you don't have a machine with 100+ gig of RAM.
There is that, of course. High-end servers can have multiple hundreds of
GB of RAM, but desktop and laptop machines rarely have anywhere near
that.
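For files that don't fit comfortably in RAM, the usual advice is to iterate over the file object itself, which holds only one line in memory at a time. A minimal sketch (the file contents and helper name are made up for illustration):

```python
import os
import tempfile


def count_lines(path):
    """Count lines without loading the whole file into memory."""
    total = 0
    with open(path) as f:
        for line in f:  # the file object yields one line at a time
            total += 1
    return total


# Hypothetical demonstration file:
with tempfile.NamedTemporaryFile("w", delete=False, suffix=".txt") as tmp:
    tmp.write("alpha\nbeta\ngamma\n")
    path = tmp.name

n = count_lines(path)
os.remove(path)
```

This still reads the entire file, so it is not necessarily faster, but its peak memory use is bounded by the longest line rather than the file size.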
--
Steven