PEP 289: Generator Expressions (please comment)

kirby urner urnerk at qwest.net
Mon Nov 3 02:49:12 EST 2003


So list comprehensions don't just produce iterables, they produce
indexables, e.g. [func(j) for j in range(n)][10] makes sense (if
n > 10).
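
To make that concrete (func here is just a stand-in):

    # A list comprehension builds the whole list eagerly, so indexing
    # and slicing work just like they do on any other list.
    def func(j):                  # stand-in for whatever per-item work
        return j * j

    result = [func(j) for j in range(20)]
    print(result[10])             # 100 -- fine, since 20 > 10
    print(result[5:8])            # [25, 36, 49]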

So presumably a generator expression, as a kind of "lazy list
comprehension", would be forced to iterate up to whatever index was
asked of it (via __getitem__) in order to return (genexpr)[n].  Of
course, simple generators don't implement __getitem__.
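
As a quick sanity check, assuming the (expr for x in iterable) syntax
from the PEP, the nearest workaround I can see is itertools.islice:

    from itertools import islice

    g = (j * j for j in range(100))      # proposed PEP 289 syntax
    # g[10] would raise TypeError: generators have no __getitem__
    tenth = list(islice(g, 10, 11))[0]   # consumes items 0..9 along the way
    print(tenth)                         # 100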

So what happens to the members that have already been produced?
Are they cached somehow?

I'm wondering if there's any confusion about how a generator *inside*
a generator expression would be evaluated, i.e. right now it makes
sense to go [x for x in gen()] if gen() has a stop condition -- the
comprehension syntax will force gen() to the end of its run.  Lazy
evaluation would suggest generators with no terminus might be
enclosed, e.g. d = (p for p in allprimes()).  After which, d[100]
would return the 101st prime (d[0] --> 2).  So d[50] would now be
retained in memory somewhere, but d[1000] would trigger further
iterations?  And could we do slices too?
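
If that kind of indexing were really wanted, I imagine the generator
would get wrapped in a small caching object rather than the genexpr
growing a __getitem__ of its own.  A rough sketch (allprimes and
LazyList are made up for illustration; nothing like this is in the
PEP):

    from itertools import count

    def allprimes():              # hypothetical endless prime generator
        found = []
        for n in count(2):
            if all(n % p for p in found):
                found.append(n)
                yield n

    class LazyList:
        """Cache whatever the iterator has produced and index into it."""
        def __init__(self, iterator):
            self._it = iterator
            self._cache = []
        def __getitem__(self, i):
            while len(self._cache) <= i:     # pull until index i exists
                self._cache.append(next(self._it))
            return self._cache[i]

    d = LazyList(allprimes())
    print(d[0])       # 2
    print(d[100])     # iterates out to the 101st prime, caching as it goes
    print(d[50])      # already cached -- no further iteration
    print(d[1000])    # triggers more iteration, as speculated above

Slices would need separate handling in __getitem__, since a slice
object gets passed in instead of an int.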

Kirby



