problem with variable scoping

Tim Peters tim_one at email.msn.com
Mon May 31 00:34:54 EDT 1999


[Tim]
>> ...
>> Arbitrary namespace nesting + indefinite extent may be in Python2; a
>> technical hangup with garbage collection prevents it in Python1,
>> unless you simulate it by hand via default-argument abuse (search
>> DejaNews if you're intrigued).

[Moshe Zadka]
> Yes, I know that trick. And actually, IMHO, a better simulation would be
> via classes and that wonderful __call__ special form.
> (Reason: works better with variable length argument lists)

Agreed classes are better (& it can be done with or without __call__), but
for a different reason:  classes in Python were designed to carry state, and
arglist tricks weren't.  So it's a matter of playing along with the basic
design instead of fighting it.  This is a Social Good.
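
For the record, here's a minimal sketch of both simulations (the names are
made up purely for illustration):

# Default-argument abuse:  freeze the outer name's value at "def" time.
def make_adder(n):
    def add(x, n=n):            # n's current value becomes add's default
        return x + n
    return add

# Class-based version:  carry the state explicitly and call via __call__.
class Adder:
    def __init__(self, n):
        self.n = n
    def __call__(self, x):      # variable-length arglists work just as well here
        return x + self.n

print(make_adder(3)(4))         # prints 7
print(Adder(3)(4))              # prints 7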

>>> still-waiting-for-arbitrary-expressions-in-lambdas-

>> Already there:  they're called "def".

> Oh, come on, you expect me to write:
> def incr(x): return x+1
> l=map(incr, l)
>
> Instead of
> l=map(lambda x: x+1, l)
>
> Ouch!!!!

If you have trivial expressions you want to apply across lists, I "expect
you" to install the NumPy extension and use the array syntax designed for
this purpose.
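
E.g., assuming NumPy (Numeric, in the old days) is installed, the whole-array
spelling of the example above is just:

import numpy as np

l = [1, 2, 3]
l = (np.asarray(l) + 1).tolist()    # elementwise x+1; no lambda, no map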

If you have complex expressions you want to apply across lists, you're much
better off using an explicit "for" loop in Python.  Generally much faster,
and for most Python programmers much clearer too.

import math

max_hypot = 0.0
best_pair = None
for x, y in sides:          # sides: a sequence of (x, y) pairs
    hypot = math.sqrt(x**2 + y**2)
    if hypot > max_hypot:
        max_hypot = hypot
        best_pair = x, y

Anyone who wants to write that as a nest of function applications instead is
simply diseased <wink>.
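
For anyone determined to catch the disease anyway, the nested-functional
spelling looks roughly like this (illustrative only):

import math
from functools import reduce        # reduce was a builtin back then

max_hypot, best_pair = reduce(
    lambda best, pair: max(best, (math.hypot(*pair), pair),
                           key=lambda t: t[0]),
    sides,
    (0.0, None))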

BTW, check out the old-but-not-so-old c.l.py threads on "list
comprehensions".  That's a gimmick borrowed from functional languages (not
Scheme-ish, but Haskell-ish) that strikes me as much more Pythonic than lambda
and filter.
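
For a taste of the proposed syntax (it later landed in Python 2.0), compare:

l = [1, 2, 3, 4]

squares_of_evens = [x**2 for x in l if x % 2 == 0]      # the comprehension

squares_of_evens = list(map(lambda x: x**2,             # the lambda spelling
                            filter(lambda x: x % 2 == 0, l)))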

>>> and-tail-call-optimization-which-is-possible-in-python-ly y'rs, Z.

>> It's not possible to identify a tail call at compile time ...

> Hey, Tim, you know I put that part in just to cause you to say it
> isn't so I could answer more fully <wink>.
>
> OK: Notice ``tail *call* optimization'', not ``tail *recursion*
> optimization'', which is a special case anyway (and not resembling a blue
> sphere at all).

Doesn't matter, though -- recursive or not, Python can't recognize a tail
call at compile time.  The example I gave was recursive in order to
illustrate that it can't even handle the most obvious case <wink>.

> Whenever the python interpreter sees something like
>
> def f(...):
> 	.
> 	.
> 	.
> 	return g(...)
>
> It could generate code for the stack popping *before*, rather than
> *after* the evaluation of g(...),

1) It doesn't know at compile time that g(...) is a call.  It's either that
or a runtime error, though, so it could pump out a special opcode that
considered only those cases.  For that matter, it already does <wink>.
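
A sketch of why the compiler can't know more:  whatever "g" happens to be
bound to is only discovered at runtime:

def f(x):
    return g(x)          # looks like a tail call at compile time ...

def g(x):
    return x + 1

print(f(1))              # prints 2 -- g happened to be a function

g = 42                   # rebind g to something uncallable
f(1)                     # ... and now the same bytecode raises TypeError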

2) The Python compiler doesn't generate code to pop stack frames.  That's a
small part of what I was trying to get across by saying tail-call
optimization wouldn't do you any good anyway given the current global
structure of the PVM:  the latter wasn't designed to support it, so it's a
battle.

3) "Stack frames" come in two flavors today, the kind Python explicitly
generates to hold Python info, and the ones implicitly constructed by C to
hold C info.  Python is implemented in C, and *both* kinds of frames get
mixed into the bag during a Python-level call today; there's nothing the PVM
can do today to tail-call away the C frame sitting on the C stack (well, at
least not before C does tail-call optimization too ...).  Again part of the
current global PVM structure.
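
Concretely, that's part of why CPython enforces a recursion limit -- deep
Python-level recursion dies long before memory runs out (a sketch; the exact
limit varies):

import sys

def count_down(n):
    if n == 0:
        return 0
    return count_down(n - 1)      # a textbook tail call, still grows the stack

print(sys.getrecursionlimit())    # typically 1000
count_down(10000)                 # RecursionError (RuntimeError in old Pythons)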

4) What's the point?  If it's just a matter of not growing the stack depth,
with no speed advantage, hardly anyone would care.  There are real downsides
to doing this too (see below), so almost everyone would care a lot <wink>.

> since f(..) and its parameters are useless (unless, of course, someone
> decides to play with the frames inside g, in which case, he/she deserves
> to get shot)

That's the major downside for me:  playing with f's frame inside g is
*extremely* useful when g raises an unhandled exception.  Think "debugger"
here.  If "f" went missing from the traceback report that would be confusing
enough, but not being able to examine f's locals in a post-mortem analysis
would be a disaster (given that as often as not, it was some local
computation of f's (or something up *its* call chain) that caused a bad
argument to g that caused g to blow up).
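
A sketch of what would get lost -- after the exception, f's frame and its
locals are sitting right there in the traceback:

import sys

def g(x):
    return 1 / x                  # blows up when x == 0

def f():
    divisor = 0                   # the local we want to see post-mortem
    return g(divisor)             # the "tail call"

try:
    f()
except ZeroDivisionError:
    tb = sys.exc_info()[2].tb_next    # skip the module-level frame
    while tb is not None:             # walk the chain:  f, then g
        frame = tb.tb_frame
        print(frame.f_code.co_name, frame.f_locals)
        tb = tb.tb_next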

The notion that f's locals are dead after it tail-calls to g is true only in
an anal theoretical sense <wink>; in practice, no local is really dead
until f returns without exception, and keeping f's frame alive reflects the
reality that when things go wrong people desperately do want to look at f's
locals.  This doesn't preclude keeping f's frame alive but unhooking it from
the call stack, though.  There!  We can give you unbounded memory use *and*
confusing tracebacks, all at the same time <wink>.

> Well, my paradigm is ``think in Scheme, write in Python, optimize in C''.
> It sort of works, and though Guido would hate me for that, I'm sad that
> too much gets lost in the translation.

Over time you're likely to think more in Python, and less will get lost.

> though-i-find-i-can-live-w/o-continuations-so-guido-needn't-worry-ly y'rs,
> Z.

See?  You're thinking more in Python already <wink>.

idiomatic-scheme-is-a-treat-and-so-is-idiomatic-python-ly y'rs  - tim