Fast recursive generators?

Terry Reedy tjreedy at udel.edu
Fri Oct 28 22:27:40 EDT 2011


On 10/28/2011 8:49 PM, Michael McGlothlin wrote:

>> Better to think of a sequence of values, whether materialized as a 'list' or
>> not.
>
> The final value will actually be a string, but it seems it is usually
> faster to join a list of strings than to concatenate them one by one.

.join() takes an iterable of strings as its argument.
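
For example, a generator expression can be fed straight to .join()
(a minimal illustration, not code from the thread):

parts = (str(i) for i in range(5))
print('-'.join(parts))    # prints 0-1-2-3-4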

>> Comprehensions combine map and filter, both of which conceptually work on
>> each item of a pre-existing list independently. (I am aware that the
>> function passed can stash away values to create dependence.)
>
> The problem is I don't really have a pre-existing list.

So, as I said, map, filter, and comprehensions are not applicable to 
your problem.
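
To see why (a minimal illustration; the names are hypothetical):

# map and comprehensions transform items that already exist,
# each output depending only on its own input:
doubled = [2 * x for x in [1, 2, 3]]

# A recurrence value = func(value) has no such input sequence;
# each item depends on the previous one, so there is nothing to
# map over.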

>> def do(func, N, value):
>>     yield value
>>     for i in range(1, N):
>>         value = func(value)
>>         yield value
>>
>> For more generality, make func a function of both value and i.
>> If you need a list, "l = list(do(f,N,x))", but if you do not, you can do
>> "for item in do(f,N,x):" and skip making the list.
>
> Generators seem considerably slower than using comprehension or
> map/filter

So what? A saw may cut faster than a hammer builds, but I hope you don't 
grab a saw when you need a hammer.

> /reduce.

Do you actually have an example where
''.join(reduce(update, range(N), [start])) is faster than
''.join(update_gen(N, start))? Resuming a generator n times should be
*faster* than n calls to the update function passed to reduce, which
has to build up a list for .join() to consume.
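
A quick way to check (a rough sketch; func, update_gen, and
reduce_step are stand-in names, not code from the thread):

from functools import reduce
import timeit

def func(value):
    # stand-in O(1) update step; substitute the real one
    return value

def update_gen(n, start):
    # generator version: yield each successive value
    value = start
    yield value
    for i in range(1, n):
        value = func(value)
        yield value

def reduce_step(acc, i):
    # reduce's update function must grow a list for join to consume
    acc.append(func(acc[-1]))
    return acc

print(timeit.timeit(lambda: ''.join(update_gen(1000, 'a')),
                    number=100))
print(timeit.timeit(
    lambda: ''.join(reduce(reduce_step, range(1, 1000), ['a'])),
    number=100))

Either way, the generator side never grows a list; actual timings,
not intuition, should settle the question.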

---
Terry Jan Reedy



