[Python-ideas] (no subject)

Steven D'Aprano steve at pearwood.info
Wed May 6 17:48:11 CEST 2015


On Wed, May 06, 2015 at 06:59:45AM -0700, Andrew Barnert via Python-ideas wrote:

> Python functions don't just take 1 parameter, they take any number of 
> parameters, possibly including optional parameters, keyword-only, 
> *args, **kwargs, etc.

Maybe Haskell programmers are used to functions which all take one 
argument, and f(a, b, c) is syntactic sugar for f(a)(b)(c), but I doubt 
anyone else is. When we Python programmers manually compose a function 
today, by writing an expression or a new function, we have to deal with 
the exact same problems. There's nothing new about the programmer 
needing to ensure that the function signatures are compatible:

def spam(a, b, c):
    return a+b+c

def eggs(x, y, z):
    return x*y/z

def composed(*args):
    return eggs(spam(*args))  # doesn't work: spam() returns one value, eggs() expects three

It is the programmer's responsibility to compose compatible functions. 
Why should it be a fatal flaw that the same limitation applies to a 
composition operator?
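For contrast, manual composition is unremarkable when the signatures do line up. A small sketch with made-up functions (the names are mine, purely for illustration):

```python
def shift(x):
    return x + 3

def scale(x):
    return 2 * x

def composed(*args):
    # works: shift() returns one value, scale() takes one argument
    return scale(shift(*args))

print(composed(5))  # prints 16
```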

Besides, with Argument Clinic, it's possible that the @ operator could 
catch incompatible signatures ahead of time.


> There are a dozen different compose 
> implementations on PyPI and ActiveState that handle these differently. 

That is good evidence that this is functionality that people want.

> Which one is "right"?

Perhaps all of them? Perhaps none of them? There are lots of buggy or 
badly designed functions and classes on the internet. Perhaps that 
suggests that the std lib should solve it right once and for all.


> The design you describe can be easily implemented as a third-party 
> library. Why not do so, put it on PyPI, see if you get any traction 
> and any ideas for improvement, and then suggest it for the stdlib?

I agree that this idea needs to have some real use before it can be 
added to the std lib, but see below for a counter-objection to the PyPI 
objection.


> The same thing is already doable today using a different 
> operator--and, again, there are a dozen implementations. Why isn't 
> anyone using them?

It takes a certain amount of effort for people to discover and use a 
third-party library: one has to find a library (or several), determine 
that it is mature, decide among competing libraries, check that the 
licence is suitable, and download and install it. This "activation 
energy" is insignificant if the library does something big, say, like 
numpy, or nltk, or even medium sized.

But for a library that provides effectively a single function, that 
activation energy is a barrier to entry. It's not that the function 
isn't useful, or that people wouldn't use it if it were already 
available. It's just that the effort to get it is too much bother. 
People will do without, or re-invent the wheel. (Re-inventing the wheel 
is at least fun. Searching PyPI and reading licences is not.)


> Thinking in terms of function composition requires a higher level of 
> abstraction than thinking in terms of lambda expressions.

Do you think it's harder than, say, the "async for" feature that's just 
been approved by Guido?

Compared to asynchronous code, I would say function composition is 
trivial. Anyone who can learn the correspondence

    (a @ b)(arg)  <=> a(b(arg))

can deal with it.
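That correspondence can be sketched in a few lines. The class below is a hypothetical illustration of `@` as composition via `__matmul__`, not the proposed stdlib design:

```python
import math

class Composable:
    """Hypothetical wrapper making '@' mean function composition."""
    def __init__(self, func):
        self.func = func

    def __matmul__(self, other):
        # (a @ b)(arg) is defined as a(b(arg))
        g = other.func if isinstance(other, Composable) else other
        return Composable(lambda *args, **kwargs: self.func(g(*args, **kwargs)))

    def __call__(self, *args, **kwargs):
        return self.func(*args, **kwargs)

f = Composable(str) @ Composable(math.sqrt)
print(f(16))  # prints 4.0 as a string: '4.0'
```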


> Python doesn't have a static optimizing compiler that can avoid 
> building 4 temporary function objects to evaluate (plot @ sorted @ 
> sqrt @ real) (data_array), so it will make your code significantly 
> less efficient.

Why would it necessarily have to create 4 temporary function objects? 
Besides, the rules for optimization apply here too: don't dismiss 
something as too slow until you've measured it :-)

We shouldn't care about the cost of the @ operator itself, only the cost 
of calling the composed functions. Building the Composed object 
generally happens only once, while calling it generally happens many 
times.
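One way to avoid nested temporaries is for the operator to flatten chains into a single tuple of functions, so `a @ b @ c` builds one object rather than nested closures. A sketch of that design (the class is my own illustration, not part of any proposal):

```python
class Composed:
    """Hypothetical flat composition: a @ b @ c stores (a, b, c)."""
    def __init__(self, *funcs):
        self.funcs = funcs  # applied right-to-left, as in a(b(c(x)))

    def __matmul__(self, other):
        extra = other.funcs if isinstance(other, Composed) else (other,)
        return Composed(*(self.funcs + extra))

    def __call__(self, arg):
        for f in reversed(self.funcs):
            arg = f(arg)
        return arg

pipeline = Composed(str) @ Composed(abs) @ Composed(int)
print(pipeline("-42"))  # prints '42'
```

Here composing is cheap and happens once; each call is a simple loop over the stored functions.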


> Is @ for composition and () for application really sufficient to write 
> point free code in general without auto-curried functions, operator 
> sectioning, reverse compose, reverse apply, etc.? Most of the examples 
> people use in describing the feature from Haskell have a (+ 1) or (== 
> x) or take advantage of map-type functions being (a->b) -> ([a] -> 
> [b]) instead of (a->b, [a]) -> [b].

See, now *that's* why people consider Haskell to be difficult: it is 
based on areas of mathematics which even maths graduates may never have 
come across. But function composition is taught in high school. (At 
least in Australia, and I expect Europe and Japan.) It's a nice, simple 
and useful functional tool, like partial.
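Python's rough analogue of a Haskell section like (+ 1) or (== x) is `functools.partial` applied to the `operator` module's functions. A small illustration:

```python
from functools import partial
import operator

add_one = partial(operator.add, 1)    # roughly Haskell's (+ 1)
equals_ten = partial(operator.eq, 10) # roughly (== 10)

print(add_one(41))     # prints 42
print(equals_ten(10))  # prints True
```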


-- 
Steve

