[Tutor] String concatenation too slow
wesley chun
wescpy at gmail.com
Tue Jul 1 07:37:04 CEST 2008
> I've been
> using string concatenation to read through the string, and then create
> the new one based on a dictionary lookup. However it becomes very slow
> once the string gets very long (several thousand characters). [...]
> I was wondering if there might be a
> faster alternative to string concatenation. Would appending to a list
> of strings be faster?
without knowing more about your application, my 1st inclination would
be to turn the code that "appends" each successive addition to the
string into a generator function. then, when you need the final,
massively large string, i'd use a generator expression inside the call
to the delimiter's join() method.
for example:

    def nextLsystem(...):
        :
        for n in range(XXX):
            # blah-blah stuff in a loop
            yield nextStr

    final = ''.join(x for x in nextLsystem(XXX))
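to make that concrete, here's a runnable sketch of the same idea. the
rewrite rules and starting axiom are made up for illustration -- the
original post didn't include them:

```python
# hypothetical L-system rewrite rules -- invented for illustration,
# not taken from the original post.
RULES = {'A': 'AB', 'B': 'A'}

def rewritten(s):
    """yield the replacement for each character (a dictionary lookup)
    instead of building the new string by concatenation."""
    for ch in s:
        yield RULES.get(ch, ch)   # unknown chars pass through as-is

def lsystem(axiom, generations):
    current = axiom
    for _ in range(generations):
        # join() consumes the generator and builds each generation's
        # string in a single pass
        current = ''.join(rewritten(current))
    return current

print(lsystem('A', 5))   # prints ABAABABAABAAB
```

each generation is produced lazily, character by character, and join()
does the one-shot assembly, so no intermediate string is grown by
repeated concatenation.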
i like this code because it doesn't keep building up a data structure
the way repeated string concatenation or continual list appends do;
both of those are memory-intensive.
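for completeness, the list-append variant asked about in the original
question would look something like this (again with made-up rules).
it's also linear-time overall, unlike building the result with
repeated '+=', which can be quadratic:

```python
def rewrite_with_list(s, rules):
    """collect the pieces in a list, then join once at the end --
    far cheaper than growing a string with repeated '+='."""
    pieces = []
    for ch in s:
        pieces.append(rules.get(ch, ch))  # dictionary lookup per char
    return ''.join(pieces)

# hypothetical rules, just for illustration
print(rewrite_with_list('ABA', {'A': 'AB', 'B': 'A'}))  # prints ABAAB
```

the trade-off versus the generator version is that the whole list of
pieces is held in memory before the final join().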
i'm using a generator to create each successive string, without
necessarily saving the previous results. then the generator expression
-- unlike a list comprehension, which must build an entire list --
passes each string to join(), which creates the final string.
i'm sure others have better ideas, but like i said, it's just a gut
shot from me here.
good luck!
-- wesley
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
"Core Python Programming", Prentice Hall, (c)2007,2001
http://corepython.com
wesley.j.chun :: wescpy-at-gmail.com
python training and technical consulting
cyberweb.consulting : silicon valley, ca
http://cyberwebconsulting.com