Finally figured out member generators...

Bjorn Pettersen BPettersen at NAREX.com
Tue Mar 4 13:54:17 EST 2003


> From: Steven Taschuk [mailto:staschuk at telusplanet.net] 
> 
> Quoth Bjorn Pettersen:
>   [...]
> >  >>> class D:
> >  ...   def __init__(self): self.v = 5
> >  ...   def __iter__(self): return self
> >  ...   def next(self):
> >  ...       for i in range(10):
> >  ...          yield i + self.v
> >  ...
> >  >>> d = D()
> >  >>> list(d)
> > 
> > which will recurse infinitely. That __iter__ should return
> > self.next(), thus creating and returning an iterator
> > object because of generator semantics didn't occur to
> > me since the whole point was to have the class
> > be its own iterator... Oh, well, it was fun anyways :-)
> 
> Um.  I think you're conflating iterators with iterables.  In the
> code above
[...]

Well, yes, I guess I gave it away when I said "the whole point was to
have the class be its own iterator" <wink>. Although it would certainly
cause problems with multiple simultaneous iterations over the same
container (see the session after the E example below), there shouldn't
be anything wrong with doing it when you know there can only be one for
loop (etc.) working on the container at a time.

The way I read the docs, the iterator protocol was simply: (a) an
iterable container must have a method __iter__(self), which (b) returns
an object with a method next(self) that, when called repeatedly, returns
successive items of the container, and (c) next() raises StopIteration
when all items in the container have been returned.
Here's an example:

>>> class E:
...     def __init__(self):
...         self.value = range(10)
...         self.scale = 5
...     def __iter__(self):
...         self.cur = 0
...         return self
...     def next(self):
...         pos = self.cur
...         self.cur += 1
...         if pos < len(self.value):
...             return self.value[pos] * self.scale
...         else:
...             raise StopIteration
...
>>> e = E()
>>> list(e)
[0, 5, 10, 15, 20, 25, 30, 35, 40, 45]
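
To make the earlier caveat concrete: since E returns self from
__iter__, a second iter() call just resets the shared cursor, so two
"simultaneous" iterations step on each other. Off the top of my head,
but that's the behavior I'd expect:

>>> e = E()
>>> it1 = iter(e)
>>> it1.next()
0
>>> it2 = iter(e)     # __iter__ resets e.cur; it1 and it2 are the same object
>>> it1.next()        # starts over at 0 instead of continuing with 5
0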

The problem with the example at the top was simply that by specifying
self as the __iter__, I was implying that d.next() would return the next
item in D [which I think is perfectly reasonable -- I mean, it says it's
going to "yield" the value right there; you'd almost think there was
some magic wrapping going on -- who was it that said it wouldn't confuse
us to reuse 'def' to declare generators?..... (tim?) <wink>]

The reality is, of course, that d.next() returns not an item in D, but
a (generator-)iterator: an object with its own next() method that will
produce the items.
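
In other words, each call to d.next() manufactures a brand new
generator; you have to call next() on *that* to get the actual values.
Roughly (off the top of my head -- "gen" is just a throwaway name, and
the exact repr/address will vary):

>>> d = D()
>>> d.next()                 # not 5 -- a brand new generator object
<generator object at 0x...>
>>> gen = d.next()
>>> gen.next()               # the generator's own next() produces the items
5
>>> gen.next()
6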

[... how to use separate iterator classes..]

I agree that separate iterator classes are very explicit and have a
number of good qualities, but they seem heavy-handed for Python
(remember, we came from the __getitem__ protocol...). Also, I would
probably make the iterator an inner class (it doesn't pollute the
namespace, and just have a look at the C++ STL for what you can do when
common parts of classes are named the same :-):

   class MyContainer:
       class iterator:
           def __init__(self, container): ...
           def next(self): ...             # raise StopIteration when done
       def __iter__(self):
           return MyContainer.iterator(self)
       # etc.
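
For example, E from above could be redone with an inner iterator class
along these lines (a quick sketch, untested, and the names E2/iterator
are just mine). This also fixes the simultaneous-iteration problem,
since each iter() call gets its own cursor:

   class E2:
       class iterator:
           def __init__(self, container):
               self.container = container
               self.cur = 0
           def __iter__(self):
               return self
           def next(self):
               pos = self.cur
               self.cur += 1
               if pos < len(self.container.value):
                   return self.container.value[pos] * self.container.scale
               raise StopIteration
       def __init__(self):
           self.value = range(10)
           self.scale = 5
       def __iter__(self):
           return E2.iterator(self)

list(E2()) gives the same [0, 5, 10, ..., 45] as before, but two
iterators over one E2 instance now advance independently.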

[...]
> A generator function, as you say, is handy here, because it lets
> you avoid writing a separate iterator class:
> 
> 	class WithGenerator:
> 		def __init__(self, scale):
> 			self.scale = scale
> 		def __iter__(self):
> 			for i in range(5):
> 				yield i*self.scale

Agreed. I haven't used generators much until now (at least not in Python
:-), but I think we're going to move to 2.3 final, where they're
standard (no __future__ import needed), etc., etc.
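
Just to convince myself: each call to WithGenerator.__iter__ hands back
a fresh generator, so simultaneous iterations don't interfere. Something
like this (off the top of my head; assumes 2.3, or
"from __future__ import generators" on 2.2):

>>> w = WithGenerator(2)
>>> list(w)
[0, 2, 4, 6, 8]
>>> it1, it2 = iter(w), iter(w)
>>> it1.next(), it1.next(), it2.next()
(0, 2, 0)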

Thanks,
-- bjorn




