iters on ints? (reducing the need for range/xrange)
William Tanksley
wtanksle at dolphin.openprojects.net
Mon Nov 12 18:22:39 EST 2001
On Sat, 10 Nov 2001 01:30:38 +0000 (UTC), Huaiyu Zhu wrote:
>On Fri, 09 Nov 2001 23:30:25 GMT, William Tanksley
><wtanksle at dolphin.openprojects.net> wrote:
>>What if slicing/indexing the "int" builtin resulted in a generator with
>>the desired behavior?
>>for x in int[10]:
>>for x in int[1:11]:
>>for x in int[1:11:2]:
>Best proposal so far.
Very kind of you. Of course, it's faint praise -- the previous proposals
didn't look like they had any chance either.
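(Aside: in today's Python the proposed spelling can actually be sketched. The `Int` class below is a hypothetical stand-in for the `int` builtin, using `__class_getitem__` to interpret the subscript — a sketch of the proposal, not how `int` behaves.)

```python
import itertools

class Int:
    # Hypothetical stand-in for the proposed slicing of the int builtin.
    def __class_getitem__(cls, item):
        if isinstance(item, slice):
            start = item.start if item.start is not None else 0
            step = item.step if item.step is not None else 1
            if item.stop is None:
                # Open-ended slice: an unbounded iterator, as in int[0:]
                return itertools.count(start, step)
            return iter(range(start, item.stop, step))
        # Plain index: Int[10] iterates 0..9
        return iter(range(item))

list(Int[10])       # [0, 1, ..., 9]
list(Int[1:11:2])   # [1, 3, 5, 7, 9]
```

Note that `Int[0::-1]` falls out naturally as `itertools.count(0, -1)`, i.e. 0, -1, -2, ... — matching the corrected spelling for "all non-positive integers" below.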
>># now some hard ones
>>for x in int[0:]: # all the natural numbers
>>for x in int[0::2]: # all the even naturals
>>for x in int[:0]: # illegal
>>for x in int[:0:-1]: # legal, all negative numbers
>The negative steps are tricky. The last one should be int[0::-1].
Ah. Unfortunately, you're right. An interesting special case.
>But what would x[a:b:-1] do if x is an iterator? Maybe that's useful for a
>"bidirectional iterator"? A unidirectional iterator would raise exception
>and say "no attribute 'prev'" and the like.
Setting up a starting point on an iterator is not a very clear concept,
nor is setting a negative skip. Many iterators can handle it easily;
others clearly can't.
I seem to recall that the bytecodes, and therefore the function calls, for
[x], [x:y], and [x:y:z] are different. The solution is probably to use
that difference: if an iterator can't handle a given type of "slicing", it
shouldn't implement the corresponding method.
>It might be useful to abstract out the slicing objects itself, which is used
>in NumPy arrays. So 1:10:2 would be the syntax for creating a slicing
>object.
No, that's the same as the already-rejected PEP. It has no chance of
surviving.
>To enable steps of other integers, you'd need an optional argument in next
>class someiterator:
> def next(self, step=1):
No, you really wouldn't; I mean, it might be nice, but all you NEED is to
call next() repeatedly.
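(To illustrate: a stepped view needs nothing beyond repeated next() calls. `itertools.islice` in the modern standard library does exactly this wrapping — the `naturals` generator here is just for the example.)

```python
import itertools

def naturals():
    # Unbounded generator: 0, 1, 2, ...
    n = 0
    while True:
        yield n
        n += 1

# Every other natural number, implemented purely by calling
# next() on the underlying iterator and discarding the skipped items.
evens = itertools.islice(naturals(), 0, None, 2)
[next(evens) for _ in range(5)]  # [0, 2, 4, 6, 8]
```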
>>Wait -- a question. Do slices act on generators? That is, can I slice
>>out every other item of a generator (as I just did above)? If not, could
>>that be added? Once that's added, what's to stop me from annoyingly
>>demanding to be able to slice out every 'n'th item (keeping the rest as
>>part of the generated sequence)? Such a thing would feed my hunger for an
>>interesting prime # sieve...
>What would be the semantics? Is it iterate and discard, or simply jump?
Jump would be ideal. Iterate-and-discard is the only semantics that's
generally possible right now. I would suggest that just as all iterators
have a next() method, some iterators could also have __slice_range__ and
__slice_gap__ methods. If they don't have those methods, you can't use
that type of slicing on them (just as you can't index an object that
doesn't have __getitem__).
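(The dispatch described above might be sketched like so. The `__slice_gap__` name is from this post; the `sliced` helper and everything else is an assumption, with `itertools.islice` standing in for the iterate-and-discard fallback.)

```python
import itertools

def sliced(iterator, start, stop, step):
    # Prefer a "jump" if the iterator's type advertises one;
    # otherwise fall back to iterate-and-discard via islice.
    hook = getattr(type(iterator), "__slice_gap__", None)
    if hook is not None:
        return hook(iterator, start, stop, step)
    return itertools.islice(iterator, start, stop, step)

sliced(iter(range(100)), 10, 30, 5)  # yields 10, 15, 20, 25
```

An iterator over a random-access structure could implement `__slice_gap__` as a real jump in O(1) per item, while a one-way generator simply omits it and pays the discard cost.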
>>Wait, I'm hearing a voice. It tells me, "WRITE A PEP."
>Well, go for it. :-)
I can't right now -- I'm too busy resisting the temptation to write a PEP.
Must ... resist ...
>Huaiyu
--
-William "Billy" Tanksley