[Tutor] lazy? vs not lazy? and yielding

Steven D'Aprano steve at pearwood.info
Fri Mar 5 00:07:34 CET 2010


On Fri, 5 Mar 2010 07:57:18 am Andreas Kostyrka wrote:

> > A list comprehension builds the whole list at once.  So if the
> > list needed is large enough, it may take far too long to finish,
> > and besides, you'll run out of memory and crash.  A generator
> > expression builds a generator instead which *acts* like a list,
> > but doesn't actually build the values up front.
>
> Well, it acts like an iterable. A list-like object would probably
> have __getitem__ and friends. Historically, the for loop in Python
> did a "serial" __getitem__ call (as in I doubt you'll find a version
> in the wild that still does it that way; I looked it up, and
> iterators were introduced around version 2.2).

for loops still support the serial __getitem__ protocol. While it is 
obsolete, it hasn't been removed, because it is still needed to support 
old code that relies on it.

When you write "for item in thing", Python first attempts the iterator 
protocol. It calls iter(thing), which looks for a thing.__iter__ 
method; if one is found, Python repeatedly calls the next() method of 
the iterator it returns, until that call raises StopIteration.
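Roughly, the for loop is equivalent to this sketch (using the iter() 
and next() built-ins, which dispatch to those methods):

```python
# What "for item in thing" does under the iterator protocol,
# spelled out by hand with iter() and next().
thing = [10, 20, 30]

items = []
it = iter(thing)           # calls thing.__iter__()
while True:
    try:
        item = next(it)    # calls the iterator's next()/__next__() method
    except StopIteration:  # the iterator signals it is exhausted
        break
    items.append(item)     # the body of the for loop

print(items)  # [10, 20, 30]
```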

If thing doesn't support the iterator protocol, Python falls back to 
the old sequence protocol: it calls thing.__getitem__(0), 
__getitem__(1), __getitem__(2), and so on, until one of those calls 
raises IndexError.
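For example, here's a minimal sketch (the class is made up for 
illustration) of an object with only __getitem__ and no __iter__, which 
a for loop can still iterate over via that fallback:

```python
# A hypothetical class with only __getitem__ -- no __iter__ method.
# The for loop falls back to calling __getitem__(0), __getitem__(1), ...
class Squares:
    def __getitem__(self, index):
        if index >= 5:
            raise IndexError   # tells the for loop to stop
        return index * index

result = [x for x in Squares()]
print(result)  # [0, 1, 4, 9, 16]
```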

If thing doesn't have a __getitem__ method either, then the loop fails 
with TypeError.
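A quick sketch of that failure mode (again, the class name is 
hypothetical):

```python
# An object with neither __iter__ nor __getitem__ cannot be iterated:
# the for loop raises TypeError immediately.
class Opaque:
    pass

try:
    for item in Opaque():
        pass
except TypeError:
    print("not iterable")
```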



-- 
Steven D'Aprano

