newbie: Some file object tips?

Flavian Hardcastle deathtospam43423 at altavista.com
Tue Feb 19 08:29:33 EST 2002


Gerhard Häring <gh_pythonlist at gmx.de> wrote in
news:mailman.1014120512.7783.python-list at python.org: 

> On 19/02/02 at 11:00, Flavian Hardcastle wrote:
>> 
>> I'm writing a crude cgi chat script. I've already got it up and
>> working with Xitami server, and have chatted with a few net buddies.
>> 
>> How it works is very rough and simple... it basically just takes data
>> from people's browsers (using the FieldStorage function) and stores it
>> in a file called log.txt, and then displays the contents of that file
>> back to them. 
>> 
>> My problem is ... 
>> 
>> The longer you chat, the bigger the log.txt file gets, and
>> consequently the web page the chatters are viewing gets bigger and
>> bigger, and becomes hard to download.
>> 
>> So I want to include a "scroll back" feature ... which will enable the
>> user to control how many previous posts (s)he views. It will probably
>> take the form a simple input box on the form ... if you want to see
>> the last 8 posts, you put 8 in the box, if you want to see the last 50
>> posts, you put 50 in the box and so on.
>> 
>> So, I need the script to be able to search the log.txt file. Any tips
>> as to how I might go about this? 
> 
> You could read the file into a list with
> 
> lines = open("log.txt").readlines()
> 
> but now, all the lines will include a newline character at their end.
> To strip these, you can do the following instead (note that this
> needs an "import string" at the top of your script):
> 
> lines = map(string.rstrip, open("log.txt").readlines())
> 
> Now you can use slicing on the list, say to return the last 8 lines,
> you can do:
> 
> lines[-8:]
> 
> HTH,
> 
> Gerhard
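[Editor's sketch of the approach Gerhard describes, written out as a runnable snippet in modern Python. The sample posts written to log.txt are invented for illustration; the real script appends whatever the chatters type.]

```python
# Write a small sample log so the sketch is self-contained
# (stands in for the chat script appending posts to log.txt).
with open("log.txt", "w") as f:
    for i in range(1, 21):
        f.write("post %d\n" % i)

# Read every line and strip the trailing newlines, as suggested above.
with open("log.txt") as f:
    lines = [line.rstrip("\n") for line in f]

# Slicing then gives the "scroll back": the last 8 posts.
last_posts = lines[-8:]
print(last_posts)
```

The list comprehension does the same job as the map(string.rstrip, ...) idiom from the quoted reply, just in the form later Python versions prefer.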

Cool! Thanx very much. Never saw that in any of the tutorials!

BTW, one thing I'm concerned about ... I want to avoid the script having to 
load the entire log.txt file into RAM every time someone makes a post. You 
see, I estimate that a log file with only 200 posts' worth of text would 
weigh about 60k. 

If, say, ten people are all posting at once, that could be as much 
as 600k in memory. If it's 20 chatters, 1.2 meg. That could take a 
couple of seconds for the server's CPU to process, and might noticeably 
slow things down ... especially on my 'puter, which is only 500MHz.

So does this command ....

> lines = open("log.txt").readlines()

... actually load the whole log into RAM?
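[Editor's note: yes, readlines() reads the entire file into memory at once. One lighter-weight alternative is collections.deque with a maxlen, which appeared in Python versions later than this 2002 thread: the file is still scanned once, but only the newest n lines are ever held. A sketch, again using an invented sample log.txt:]

```python
from collections import deque

# Write a sample log (stand-in for the real chat log).
with open("log.txt", "w") as f:
    for i in range(1, 1001):
        f.write("post %d\n" % i)

# A deque with maxlen=n keeps only the newest n lines as the file
# is read line by line, so the whole log is never in RAM at once.
n = 8
with open("log.txt") as f:
    last_posts = deque((line.rstrip("\n") for line in f), maxlen=n)

print(list(last_posts))
```

For very large logs one could also seek() near the end of the file and read only the final block, but the deque version is the simplest fix that keeps memory use bounded by n posts rather than by the log size.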

-- 
cdewin at dingoblue.net.au
