potential feature "Yield in <Iterable>"

Ismael Harun ismaelharunid at gmail.com
Tue Nov 15 22:58:04 EST 2022


This is a feature suggestion about parallel generators.  Currently a generator or iterator can yield a single value, or a collection of values (which is still a single value).  But what if you could yield to multiple objects?  In a sense, you could build multiple generators without having to set them up as return objects.

The main concept is having a way to push values to multiple iterable objects
from the same loop or code.  We have the ability to zip multiple iterables
for use in parallel.  Why not the reverse, i.e. create multiple generators in parallel?  It's just an idea, and I haven't thought it through thoroughly, but I wanted to see if anyone had thoughts, critiques, or feedback.
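For comparison, here is the existing "forward" direction: zip merges several iterables into a single parallel stream of tuples (a minimal sketch of current behavior, not of the proposal):

```python
# zip combines several iterables in parallel into one stream of tuples.
pairs = list(zip([1, 2, 3], ["a", "b", "c"]))
print(pairs)  # [(1, 'a'), (2, 'b'), (3, 'c')]
```

The proposal would be the inverse: one loop fanning values out to several consumers.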

It would require not only a parser change but also a new type, which I am calling a Collector in my example: a class with a method used to receive new items.  Such a thing could also be asynchronous.
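Python's generators already offer one mechanism for "receiving" values: `.send()`.  Here is a rough sketch of a receive-style consumer using a primed generator; the name `collector` is hypothetical and this is not the proposed Collector protocol, just an existing analogue:

```python
# A consuming generator: values are pushed in via .send(),
# and the accumulated list is yielded back after each push.
def collector():
    received = []
    while True:
        value = yield received
        received.append(value)

c = collector()
c.send(None)  # prime the generator: advance to the first yield
for v in (1, 2, 3):
    out = c.send(v)
print(out)  # [1, 2, 3]
```

The limitation, of course, is that the producer must call `.send()` explicitly; the proposal's new syntax would make the fan-out implicit in the yield itself.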

Something like:
```
def split_values(values, *collectors):
    for i in values:
        for n, c in enumerate(collectors, 1):
            yield i**n in c


class MyCollector(list, Collector):
    def __receive__(self, value):
        yield value

collector0 = MyCollector()
collector1 = MyCollector()

split_values(range(1,6), collector0, collector1)

for i in collector0:
    print(i)
for i in collector1:
    print(i)
```

Which would result in output:
```
1
2
3
4
5
1
4
9
16
25
```
