How explain why Python is easier/nicer than Lisp which has a simpler grammar/syntax?

Antoon Pardon antoon.pardon at rece.vub.ac.be
Sat Aug 15 10:17:03 EDT 2020


On 15/08/20 at 15:02, Chris Angelico wrote:
> On Sat, Aug 15, 2020 at 10:45 PM Antoon Pardon
> <antoon.pardon at rece.vub.ac.be> wrote:
>>
>>
>> I don't understand this argument. The traceback information that is
>> destroyed by this optimization is information that isn't available
>> anyway if you write the code in an iterative fashion.
> 
> Your counter-argument applies only to recursion, but TCO applies to
> *any* tail call. Consider this:
> 
> @some_deco
> def spam(n):
>     ...
>     return spam(n // 2)
> 
> Not a recursive tail call and cannot be rewritten as a loop, unless
> you know for sure that some_deco returns the original function. But
> TCO can still optimize this - by collapsing the stack frames. Which
> loses traceback info, unless you deliberately preserve it.

And how often does this kind of situation come up and destroy important
traceback information? Sure, you can come up with artificial examples
like the one above, but if that code gets into trouble because you run
into the recursion limit, you will still have to transform it into a
loop somehow.

>> I think writing code that is at heart a state machine would be a lot more
>> easy if python would have TCO. Then each state could be implemented by
>> a function and transitioning from one state to the next would be just
>> calling the next function.
> 
> I'm honestly not sure how much you'd gain by writing the transitions
> as additional calls. But then, I don't tend to write pure state
> machines in Python. If anything, they're "state machines with
> resumption" or something, and the additional wrinkles mean it's best
> to maintain state in a data structure and maintain code in a function,
> instead of trying to do both on the call stack.

Why are you talking about a stack?

I don't know what best suits your code, but my code often doesn't have
enough wrinkles to bother with extra data structures.
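
If I understand you correctly, you mean something along these lines, with
the state kept in a dict and one function driving it (a rough sketch, the
names are invented). For a simple machine this is mostly extra bookkeeping:

    def step(machine):
        # made-up driver: inspect the current state in the dict and update it
        if machine["state"] == "start":
            machine["state"] = "running"
        elif machine["state"] == "running":
            machine["count"] += 1
            if machine["count"] >= 3:
                machine["state"] = "finished"
        return machine["state"] != "finished"

    machine = {"state": "start", "count": 0}
    while step(machine):
        pass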

> 
> ISTM that everyone who begs for tail recursion optimization is trying
> to write loops as recursion. Maybe that's why Python has never
> bothered - because it's an optimization for an inferior way to write
> the same logic. Prove me wrong. Show me something where it actually
> couldn't be better written iteratively, yet it still benefits from the
> optimization.

What standard do you use to measure what is inferior or superior? Sometimes
a state machine is most easily defined as a number of mutually recursive
functions that just tail-call each other. With TCO I could write it like
the following:

    def state1(...):
        ...
        if ready_for_2:
            return state2(...)
        elif ready_for_7:
            return state7(...)
        elif finished:
            return result

    def state2(...):
        ...

and you just call it like:

    result = state1(...)
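
A concrete (made-up) instance of that style, just to show what I mean: two
states that bounce a counter between them. It works fine as long as the
chain of states stays below the recursion limit:

    def even_state(n):
        # made-up state: stop when the counter runs out, otherwise hand over to odd_state
        if n == 0:
            return "done in even_state"
        return odd_state(n - 1)

    def odd_state(n):
        # made-up state: hand straight back to even_state
        return even_state(n - 1)

    result = even_state(10)   # fine for short runs; a long run hits the recursion limit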

Without TCO I am forced to write it as follows:

    def state1(...):
        ...
        if ready_for_2:
            return state2, (...) # notice the comma between function and args.
        elif ready_for_7:
            return state7, (...)
        elif finished:
            return None, result

    def state2(...):
        ...

and then call it like:

    state = state1
    args = ...
    while state is not None:
        state, args = state(*args)
    result = args

I really don't see what I gain by having this loop, or what I could gain
from some extra data structure.
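
For what it's worth, here is that pattern spelled out completely with one
invented state (the state name and the countdown payload are made up, just
to have something runnable):

    def countdown(n, total):
        # invented state: add n, n-1, ... to total until n reaches 0
        if n == 0:
            return None, total                  # None signals "finished"; total is the result
        return countdown, (n - 1, total + n)    # next state plus its argument tuple

    def run(state, args):
        # driver loop: keep calling states until one says it is finished
        while state is not None:
            state, args = state(*args)
        return args

    result = run(countdown, (100000, 0))    # depth far beyond the recursion limit
    print(result)                           # 5000050000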

-- 
Antoon Pardon.

