[Python-ideas] deferred default arguments

Eric Snow ericsnowcurrently at gmail.com
Thu Jul 14 05:01:43 CEST 2011


On Wed, Jul 13, 2011 at 7:23 PM, Steven D'Aprano <steve at pearwood.info> wrote:
> Wild idea: if Python had a "signature object", you could do something like

You mean like PEP 362?  I would love it.  Brett has talked about
taking it from PyPI to 3.3, but I'm guessing he is still pretty busy
with life.  (There, I just poked the tracker issue)

>> * Use introspection at definition time (using X.f.__defaults__ and
>> X.f.__kwdefaults__), which is fragile (if the X.f attributes are
>> changed later or the function definition's signature changes).
>
> This becomes messy if you only want to "inherit" the default value for one
> parameter rather than all.

Yeah, that is what I had meant, and I agree it is messy.

>>
>> Provide a builtin version of a Deferred singleton, like None or
>> NotImplemented.  When resolving arguments for a call, if an argument
>> is this Deferred object, replace it with the default argument for that
>> parameter on the called function, if there is one.  If there isn't,
>> act as though that argument was not passed at all.
>
> I don't believe that should be a public object. If it's public, people will
> say "Yes, but what do you do if you want the default to actually be the
> Deferred singleton?" It's the None-as-sentinel problem all over again...
> sometimes you want None to stand in for no value, and sometimes you want it
> to be a first class value.

The idea is to have an object that has no meaning in normal code.
Sure, we use None all over, but the sample of builtin singletons in
Python is too small to say the same would happen with a new one.
Then again, maybe None started out as an object with no meaning in
normal code (I doubt it).

Agreed that if it takes on meaning, it won't be much different from
using None as a sentinel.  However, it would still not *be* None,
which alone would make it worth it.
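For concreteness, the call-time behavior I'm describing can be
roughly emulated today with a decorator on the callee.  This is only
a sketch: the DEFERRED name is a hypothetical sentinel (not a real
builtin), and it leans on PEP 362's signature object (available as
inspect.signature in 3.3+) rather than interpreter support:

```python
import functools
import inspect

DEFERRED = object()  # hypothetical builtin-style sentinel

def resolve_deferred(func):
    """Replace DEFERRED arguments with func's own defaults.

    A rough emulation of the proposed behavior using PEP 362's
    signature object instead of interpreter support.
    """
    sig = inspect.signature(func)

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for pname, value in list(bound.arguments.items()):
            if value is DEFERRED:
                default = sig.parameters[pname].default
                if default is inspect.Parameter.empty:
                    # No default: act as though the argument was not
                    # passed at all; the call below then raises the
                    # usual TypeError for a required parameter.
                    del bound.arguments[pname]
                else:
                    bound.arguments[pname] = default
        return func(*bound.args, **bound.kwargs)
    return wrapper

@resolve_deferred
def f(x, y=5):
    return (x, y)

# f(1, DEFERRED) == (1, 5); f(DEFERRED) raises TypeError
```

A builtin would of course do this uniformly for every call, without
each callee opting in via a decorator.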

>
> Better(?) to make it syntax, and overload * yet again:
>
> def f(arg, another_arg, name=*): pass
>
> This has the advantage(?) that inside the body of f, name doesn't get a
> spurious value Deferred. If the caller doesn't supply a value for name, and
> f tries to use it (except as below), then you get an unambiguous
> UnboundLocalError.
>
> The exception is, calling another function with it as an argument is
> permitted. This call in the body of the function:
>
>    result = g(1, 3, name, keyword="spam")
>
> behaves something like this:
>
>    try:
>        name
>    except NameError:
>        result = g(1, 3, keyword="spam")
>    else:
>        result = g(1, 3, name, keyword="spam")
>
> only handled by the compiler.
>
> Seems awfully complicated just to solve a few DRY violations. I'm not sure
> it is worth the added complexity of implementation, and the added cognitive
> burden of learning about it.

Yeah, that is neat, but it is a lot more complicated than having a
builtin singleton that implicitly triggers a simple(?) behavior when
passed as an argument to a function.  You can trust me that my way is
better!  I'm an expert on the inner workings of the CPython
implementation! <wink>  At least, the singleton *seems* conceptually
less complicated to me.

>
> That's not to say that DRY violations in function signatures aren't a real
> problem -- I've run into them myself. But I question how large a problem
> they are, in practice. Language support solving them seems to me to be a
> case of using a sledgehammer to crack a peanut. DRY is mostly a problem for
> implementation, not interface: implementation should remain free to change
> rapidly, and so repeating yourself makes that harder. But interface (e.g.
> your library API, including function signatures) should be static for long
> periods of time, so redundancy in function signatures is less important.

I gotta say, I love how practical Python and its community are.
Things don't just have to be useful; they have to be widely useful
and substantially more useful than the existing way of doing them.
I remember Nick saying something along those lines a few months ago.
I think that attitude has kept the language awesome.

In this situation, you're probably right that the use case isn't large
enough in practice to be important for the language.  For now I guess
I can do something like this, which is along the lines of what I was
imagining would happen implicitly:

DEFERRED = object()  # sentinel with no meaning in normal code

def handle_deferred(f, name, locals_, fname=None):
    # simpler with a function signature object...
    if not fname:
        fname = name
    if locals_[name] is not DEFERRED:
        return locals_[name]
    code = f.__code__
    kwonly_names = code.co_varnames[
            code.co_argcount:code.co_argcount + code.co_kwonlyargcount]
    if fname in kwonly_names:
        try:
            return f.__kwdefaults__[fname]
        except (TypeError, KeyError):
            # TypeError would normally be raised if f were called and a
            # non-default parameter were not passed an argument, so
            # that's what we'll raise here and below.
            raise TypeError("Can't convert DEFERRED for {}".format(name))
    default_names = code.co_varnames[
            code.co_argcount - len(f.__defaults__ or ()):
            code.co_argcount]
    try:
        return f.__defaults__[default_names.index(fname)]
    except (TypeError, ValueError):
        raise TypeError("Can't convert DEFERRED for {}".format(name))

def f(x=5):
    ...

def g(x=DEFERRED):
    ...
    x = handle_deferred(f, "x", locals())
    f(x)
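And, as the comment above hints, once PEP 362's signature object is
available (inspect.signature in 3.3+) the whole lookup collapses,
since positional and keyword-only defaults are reached the same way.
A hedged sketch, with the same toy f and g as above:

```python
import inspect

DEFERRED = object()  # sentinel, as above

def handle_deferred(f, name, locals_, fname=None):
    # A signature object handles positional and keyword-only
    # defaults uniformly, so no co_varnames slicing is needed.
    if locals_[name] is not DEFERRED:
        return locals_[name]
    param = inspect.signature(f).parameters.get(fname or name)
    if param is None or param.default is inspect.Parameter.empty:
        # Mirror the TypeError f itself would raise for a missing
        # required argument.
        raise TypeError("Can't convert DEFERRED for {}".format(name))
    return param.default

def f(x=5):
    return x

def g(x=DEFERRED):
    x = handle_deferred(f, "x", locals())
    return f(x)

# g() == 5; g(7) == 7
```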


Thanks for having a look.

-eric

> --
> Steven