Python slow for filter scripts

Delaney, Timothy C (Timothy) tdelaney at avaya.com
Tue Oct 28 23:39:07 EST 2003


> From: Alex Martelli [mailto:aleax at aleax.it]
> 
> What kind of files do your scripts most often process?  For me, a
> textfile of 4.4 MB is larger than typical.  How much do those few
> tens of milliseconds' difference matter?  You know your apps, I
> don't, but I _would_ find it rather strange if they "disqualified"
> either language.  Anything below about a second is typically fine
> with me, so even the slowest of these programs could still handle
> files of about 6 MB, assuming the 50% CPU it got is pretty typical,
> while still taking no more than about 1 second's elapsed time.

Personally, I used a Python script the other day to process a 550MB text file. Total processing took about an hour.

Note that this was not just a case of *copying* the file - there was some real, serious processing going on. It's a fairly unoptimised program, and it peaked at about 800MB of RAM. My work machine has the grunt to handle it, so there's no need to optimise the code further for now.

Just doing a simple line-by-line copy of the same file, excluding some lines (keeping about 90% of them) based on a substring search in each line, took about 2-3 minutes.

Tim Delaney





More information about the Python-list mailing list