[Python-ideas] Iterable function calls and unpacking [was: PEP for issue2292, "Missing *-unpacking generalizations"]

Oscar Benjamin oscar.j.benjamin at gmail.com
Mon Jul 15 22:43:46 CEST 2013


On 15 July 2013 21:32, Joshua Landau <joshua at landau.ws> wrote:
> I've moved this to a different thread, as I agree with Guido that it's
> a different PEP.
>
> On 15 July 2013 21:06, Guido van Rossum <guido at python.org> wrote:
>> On Mon, Jul 15, 2013 at 1:01 PM, Oscar Benjamin
>> <oscar.j.benjamin at gmail.com> wrote:
>>> On 15 July 2013 12:08, Joshua Landau <joshua.landau.ws at gmail.com> wrote:
>>>> On 15 July 2013 11:40, Oscar Benjamin <oscar.j.benjamin at gmail.com> wrote:
>>>>
>>>> In fact, I'd much like it if there was an iterable "unpacking" method
>>>> for functions, too, so "chain.from_iterable()" could use the same
>>>> interface as "chain" (and str.format with str.format_map, etc.). I
>>>> feel we already have a good deal of redundancy due to this.
>>>
>>> I've also considered this before. I don't know what a good spelling
>>> would be, but let's say that it uses *args* so that you have a function
>>> signature like:
>>>
>>>     def chain(*iterables*):
>>>        for iterable in iterables:
>>>             yield from iterable
>
> Personally some form of decorator would be simpler:
>
> @lazy_unpack()
> def chain(*iterables):
>     ...

How would the above decorator work? It would need some new interpreter
capability, since an ordinary call into the wrapper has already unpacked
everything:

from functools import wraps

def lazy_unpack(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        # By the time we get here, the * at the call site has already
        # expanded everything into the args tuple, so laziness is lost.
        return func(*args, **kwargs)
    return wrapper

@lazy_unpack
def chain(*iterables):
    ...
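The problem can be demonstrated concretely with today's semantics: the `*` at the call site drains the iterator into a tuple before the function body (decorated or not) ever runs, so no decorator can restore laziness. A minimal sketch:

```python
def takes_args(*args):
    # args is already a fully materialised tuple here
    return args

gen = (x * x for x in range(3))
unpacked = takes_args(*gen)   # the call-site * drains gen eagerly
print(unpacked)               # (0, 1, 4)
print(list(gen))              # [] -- exhausted before the body ran
```

Calling `takes_args(*infinite_iterator)` would therefore hang at the call site, never reaching the decorator's wrapper at all.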

> (and it could theoretically work for mappings too, by just "bundling"
> them; useful for circumstances where the mapping is left untouched and
> just passed to the next function in line.)

I don't understand. Do you mean to use it somehow for **kwargs?

>>> And then if the function is called with
>>>
>>>     for line in chain(first_line, *inputfile):
>>>         # do stuff
>>>
>>> then iterables would be bound to a lazy generator that chains
>>> [first_line] and inputfile. Then you could create the unpacking
>>> iterator I wanted by just using chain e.g.:
>>>
>>>     chain(prepend, *iterable, append)
>>
>> But how could you do this without generating different code depending
>> on how the function you are calling is declared? Python's compiler
>> doesn't have access to that information.
>
> You could simply make the compiled code do a run-time check whenever a
> call has an unpack inside it. Whilst this would be slower for the
> false positives, *args is often either passed straight through (so you
> save a redundant copy of the argument tuple) or used as a simple
> loop-once construct, which makes it plausible that those losses would
> be outweighed.

It would probably be better to have a specific syntax at the call site,
since when you look at f(*infinite_iterator) you want to know whether or
not infinite_iterator is going to be expanded.
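For comparison, the lazy behaviour being asked for is already expressible today, at the cost of wrapping the single items in one-element lists and calling itertools.chain directly. A sketch of the equivalent of the proposed chain(prepend, *iterable, append):

```python
from itertools import chain, count, islice

prepend, append = "start", "end"
# Nothing is expanded eagerly here: each argument to chain is consumed
# lazily, so even an infinite iterator like count() is safe.
lazy = chain([prepend], count(), [append])
print(list(islice(lazy, 4)))  # ['start', 0, 1, 2]
```

The proposal would let the caller write the single items bare and the infinite iterator with a star, without the wrapping lists, while keeping exactly this lazy evaluation.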


Oscar
