I sing the praises of lambda, my friend and savior!

Dave Benjamin ramen at lackingtalent.com
Mon Oct 11 20:49:29 EDT 2004


In article <10mm6o6si5mdqac at corp.supernews.com>, Jeff Shannon wrote:
> In this case, I'd redefine after to be:
> 
>     def after(timeout, func, *args, **kwargs):
>         # whatever needs to be done to delay the call, then...
>         func(*args, **kwargs)

Apparently, you're not the only one. ;)

> I find this to read slightly better, and the implementation requires one 
> less level of indirection.  I'll admit that I haven't explored much in 
> the area of function currying and closures, so there *may* be ways in 
> which using lambda is preferable to using a generic variable-argument 
> closure-returning function, but at least in the cases that spring to my 
> mind, lambda's advantage is slim at best.

Python has many features that can delay evaluation and factor out
repetition, and this can largely explain why it's not so much of a nuisance
that it lacks a more powerful closure construct. For instance, on the dev
list, as quickly as the new "sort(key=lambda x: x[1])" idiom was introduced,
people started hunting for ways to squash the lambda, the winners being
"operator.attrgetter()" and "operator.itemgetter()". It is very likely that
anything you can do with a "lambda" you can also do using other features of
Python. Still, I think it's useful to note that the lambda version was the
first to be discussed, and the search for alternatives was in part motivated
by the threat of lambda's future demise. As long as lambda is there, people
will still use it for certain tasks, and sorting seems to be one of those
tasks.
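
Just to make the comparison concrete, here's a minimal sketch of the two
spellings side by side (the "pairs" list is made up for illustration):

import operator

pairs = [('b', 2), ('a', 3), ('c', 1)]

pairs.sort(key=lambda x: x[1])            # the lambda spelling
pairs.sort(key=operator.itemgetter(1))    # the lambda-free spelling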

The most convincing argument for (more powerful) lambdas over named
functions is in the design of custom control structures. For example, let's
say you wanted to define a do-while loop in Python. You could do it like this:

def do_while(proc, test):
    while True:
        proc()
        if not test():
            break
                                
i = 0

def print_and_increment_i():
    global i
    print i
    i += 1

def while_i_lt_10():
    return i < 10

do_while(print_and_increment_i, while_i_lt_10)
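
(As an aside, even today's expression-only lambda lets you inline the test,
though not the loop body:)

do_while(print_and_increment_i, lambda: i < 10)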

With code blocks, you could implement something like this instead:

i = 0
do({||
    print i
    i += 1
}, while={||
    return i < 10
})

As I write this, I feel a bit embarrassed; neither is really all that
readable, to be honest. But I'm going to post it anyway because I think it
illustrates my basic point, which is that the ability to use inline
functions allows you to implement control structures that would otherwise be
awkward and out-of-order. Real-world uses for custom control structures are
often application-specific and don't make much sense out of context. Say
you're working within a framework where you need to manage acquiring locks
and opening and closing resources:

lock.open({||
    resource.acquire({|res|
        res.dilly()
        res.dally()
    })
})

This would ensure that the lock gets closed even if the resource cannot be
acquired; if everything works out, the resource is released first and then
the lock is closed. The alternative people keep proposing would result in
code like this:

def do_while_resource_acquired(res):
    res.dilly()
    res.dally()

def do_while_lock_open():
    resource.acquire(do_while_resource_acquired)

lock.open(do_while_lock_open)

Notice how the code now reads backwards?
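
(In either spelling, by the way, the cleanup guarantee has to come from the
framework itself. Here's a rough sketch of how I imagine such a
callback-taking open() might work; the Lock class and its _do_open() and
_do_close() methods are made up for illustration:)

class Lock:
    def _do_open(self):
        print 'lock opened'
    def _do_close(self):
        print 'lock closed'
    def open(self, body):
        # run the caller's block between open and close,
        # closing even if the block raises an exception
        self._do_open()
        try:
            body()
        finally:
            self._do_close()

With something like this, Lock().open(do_while_lock_open) closes the lock
whether or not dilly() or dally() raises an exception.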

> And I'm not very happy with the decorator syntax either, to be honest.  

I'm not really convinced that decorators are useful enough to demand a syntax
extension. On the other hand, I've seen concrete examples in Ruby and other
languages that demonstrate the usefulness of code blocks, and in those cases
I think a syntax extension is worthwhile. But we all have different projects
with different needs, so one (wo)man's syntax extension is another's ugly wart.

Without macros, it's hard to please everyone. And with macros, it's hard to
please everyone. ;)

> Intermediate anonymous values make sense at a certain level of 
> granularity but not at all levels.  Practicality beats purity. 

And one (wo)man's practicality is another's impurity.

> On the other hand, I'd never write your first example -- I'd be 
> explicit about the grouping with extra parens.
> 
>    c = math.sqrt( (a*a) + (b*b) )
> 
> It's not syntactically necessary, because it's following standard 
> operator precedence, but it scans much more quickly for me. 
> 
> Indeed, maybe that's what this is all about.  To my mind, when I see a 
> lambda, I already have to stop scanning, push a level on my mental 
> stack, figure out what the heck is going on with the lambda and what 
> it's going to return, and then pop back to the line in progress.  In 
> contrast, when I'm scanning something using a (descriptively) named 
> function, the name should tell me enough of the purpose of the function 
> that I don't need to pause to figure it out.  Thus, it's much quicker 
> for me to scan a reference to a named function than to scan a lambda 
> definition.

Yes, I can understand and sympathize with this viewpoint.

> ISTM that many of these uses can be accomplished by using named 
> functions with variable arguments.   I'll admit that I'm no language 
> expert, and TBH haven't used any of those languages.  But my impression 
> is that the use of (named) function references, dynamic typing, and 
> convenient variable argument syntax can achieve most of the benefits
> that are touted for lambdas. 

And the use of wrenches, pliers, and mallets can achieve most of the
benefits that are touted for hammers. All of which happen to be objects. ;)

>>(Yet, somehow, decorators slipped through, even though nobody agreed on a
>>syntax for that either. I don't have a rational explanation for that...)
>
> Nor do I .....  *sigh*

*coughjavacough*

-- 
 .:[ dave benjamin: ramen/[sp00] -:- spoomusic.com -:- ramenfest.com ]:.
        "talking about music is like dancing about architecture."


