converting a sed / grep / awk / . . . bash pipeline into python

Roy Smith roy at panix.com
Wed Sep 3 09:41:27 EDT 2008


In article 
<7f2d4b4a-bc97-4b46-a31e-63f98e9fee73 at 34g2000hsh.googlegroups.com>,
 bearophileHUGS at lycos.com wrote:

> Roy Smith:
> > No reason to limit how many splits get done if you're
> > explicitly going to slice the first two.
> 
> You are probably right for this problem, because most lines here are
> only two items long, but in scripts that have to process lines
> potentially composed of many parts, setting a maximum number of splits
> speeds up your script and reduces the memory used, because you end up
> with fewer parts at the end.
> 
> Bye,
> bearophile

Sounds like premature optimization to me.  Make it work and be easy to 
understand first.  Then worry about how fast it is.
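
For what it's worth, the maxsplit argument he's talking about behaves like
this (the sample line is my own, just for illustration):

    # A line with several whitespace-separated fields, of which only
    # the first two are interesting.
    line = "2008-09-03 09:41:27 the rest of the log message"

    everything = line.split()       # 8 items, all of them materialized
    limited = line.split(None, 2)   # 3 items; the tail stays one string

    print(everything[:2])           # ['2008-09-03', '09:41:27']
    print(limited[:2])              # same result, fewer intermediate pieces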

But, along those lines, I've often thought that split() needed a way not 
just to limit the number of splits, but also to throw away the extra stuff.  
Getting the first N fields of a string is something I've done often enough 
that folding the slicing operation right into the split() code seems 
worthwhile.  And, it would be even faster :-)


