[Python-ideas] Tweaking closures and lexical scoping to include the function being defined

Ron Adam ron3200 at gmail.com
Sun Oct 2 19:05:17 CEST 2011


On Sun, 2011-10-02 at 09:38 -0400, Nick Coghlan wrote:
> On Sun, Oct 2, 2011 at 2:29 AM, Ron Adam <ron3200 at gmail.com> wrote:
> > On Sat, 2011-10-01 at 22:11 -0400, Nick Coghlan wrote:
> >
> > +1 on all of the zen statements of course.
> >
> > I think you made a fine case for being careful and mindful about this
> > stuff.  :-)
> 
> Heh, even if nothing else comes out of these threads, I can be happy
> with helping others to learn how to look at this kind of question from
> multiple angles without getting too locked in to one point of view
> (and getting more practice at doing so, myself, of course!)

Yes, that's a very zen way to look at it. +1

Keeping that larger picture in mind while sorting through the various
smaller options is challenging.  Hopefully the best solution (which may
include doing nothing) will be sorted out in the end.


> > One way to think of this is as Private, Shared, and Public namespaces.
> > Private and Public are locals and globals, and are pretty well
> > supported, but Shared namespaces (closures or otherwise) are not well
> > supported.
> >
> > I think the whole concept of explicit shared namespaces, separate from
> > globals and locals, is quite important and should be done carefully.  I
> > don't think it is just about one or two use cases that a small tweak
> > will cover.
> 
> "not well supported" seems a little too harsh in the post PEP 3104
> 'nonlocal' declaration era.
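For reference, a minimal sketch of what PEP 3104's nonlocal declaration
enables -- rebinding a name in an enclosing function scope:

```python
def make_counter():
    count = 0  # lives in make_counter's scope
    def increment():
        nonlocal count  # PEP 3104: rebind the enclosing name
        count += 1
        return count
    return increment

counter = make_counter()
counter()  # 1
counter()  # 2
```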

I think the introspection tools you want will help.
 
hmm... what about the vars() function?   (I tend to forget that one)

vars(...)
    vars([object]) -> dictionary
    
    Without arguments, equivalent to locals().
    With an argument, equivalent to object.__dict__.

Could we extend that to see closures and other scope-visible names?
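One possible shape for that, as a hypothetical helper (the name and the
dict-of-cells output here are just assumptions for illustration, not
the actual vars() behaviour):

```python
def closure_vars(func):
    """Map a function's free variable names to their current cell values.

    A hypothetical sketch of what a closure-aware vars() might return;
    the real vars() builtin knows nothing about closures.
    """
    if func.__closure__ is None:
        return {}
    # co_freevars and __closure__ are in matching order.
    return {name: cell.cell_contents
            for name, cell in zip(func.__code__.co_freevars,
                                  func.__closure__)}

def outer():
    x, y = 1, 2
    def inner():
        return x + y
    return inner

f = outer()
closure_vars(f)  # {'x': 1, 'y': 2}
```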


> If we look at the full suite of typical
> namespaces in Python, we currently have the following (note that
> read/write and read-only refer to the name bindings themselves -
> mutable objects can obviously still be modified through a reference
> that can't be rebound):
>
> Locals: naturally read/write
> Function state variables (aka default argument values): naturally
> read-only, very hard to rebind since this namespace is completely
> anonymous in normal usage
> Lexically scoped non-locals: naturally read-only, writable with
> nonlocal declaration
> Module globals: within functions in module, naturally read-only,
> writable with global declaration. At module level, naturally
> read/write. From outside the module, naturally read/write via module
> object
> Process builtins: naturally read-only, writable via "import builtins"
> and attribute assignment
> Instance variables: in methods, naturally read/write via 'self' object
> Class variables: in instance methods, naturally read-only, writable
> via 'type(self)' or 'self.__class__'. Naturally read/write in class
> methods via 'cls', 'klass' or 'class_' object.
>
> Of those, I would put lexical scoping, function state variables and
> class variables in the 'shared' category - they aren't as contained as
> locals and instance variables, but they aren't as easy to access as
> module globals and process builtins, either.
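A small sketch of the class variable rules described above -- assigning
through self quietly shadows the class variable rather than rebinding
it:

```python
class Counter:
    total = 0  # class variable, shared across instances

    def bump_wrong(self):
        # Looks like it rebinds the class variable, but it actually
        # creates a new instance attribute that shadows it.
        self.total = self.total + 1

    def bump_right(self):
        type(self).total += 1  # really rebinds the class variable

a, b = Counter(), Counter()
a.bump_wrong()
a.total        # 1 (instance attribute)
Counter.total  # still 0
b.bump_right()
Counter.total  # 1
```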

I think it may be easier to classify them in terms of how they are
stored.

Cell-based namespaces:
    function locals
    function closures

Dictionary-based namespaces:
    class attributes
    module globals
    builtins
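The storage difference is observable from outside:

```python
import types

def outer():
    shared = 42
    def inner():
        return shared
    return inner

f = outer()
# Cell-based: the closed-over value lives in an anonymous cell
# object, not in any dictionary.
f.__closure__[0].cell_contents   # 42

# Dictionary-based: module globals (like class attributes and
# builtins) are ordinary dict entries.
mod = types.ModuleType('example')
mod.value = 1
mod.__dict__['value']            # 1
```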


If vars() could see closures, what exactly would it do, and how would
the output look?

Would it distinguish free variables from cell variables?
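The free/cell split is already recorded on code objects, which any such
output would presumably build on:

```python
def outer():
    x = 1          # 'x' is a cell variable of outer...
    def inner():
        return x   # ...and a free variable of inner
    return inner

outer.__code__.co_cellvars    # ('x',)
outer().__code__.co_freevars  # ('x',)
```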


[clipped literal parts, for now]

> > While it's very interesting to try to find a solution, I am also
> > concerned about what this might mean in the long term.  Particularly,
> > we will see more meta-programming.  Being able to initialize an object
> > from one or more other objects can be very nice.  Python does that
> > sort of thing all over the place.
> 
> I'm not sure I understand what you mean in your use of the term
> 'meta-programming' here. The biggest danger to my mind is that we'll
> see more true process-level globals as state on top-level functions,
> and those genuinely *can* be problematic (but also very useful, which
> is why C has them). It's really no worse than class variables, though.

I'm thinking of automated program generation.  A programming language
with a lot of fixed syntax, and no dynamic way to do the same things,
makes that harder.  You pretty much have to resort to exec and eval in
those cases, or avoid those features.


> The other objection to further enhancing the power of functions to
> maintain state is that functions aren't naturally decomposable the way
> classes are - if an algorithm is written cleanly as methods on a
> class, then you can override just the pieces you need to modify while
> leaving the overall structure intact. For functions, it's much harder
> to do the same thing (hence generators, coroutines and things like the
> visitor pattern when walking data structures).

I would like very much for functions to be a bit more decomposable. 
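The decomposability contrast, sketched: with a class, one step of an
algorithm can be overridden while the rest stays intact, whereas a
closure offers no equivalent seam:

```python
class Pipeline:
    def run(self, data):
        return self.finish(self.transform(data))

    def transform(self, data):
        return [x * 2 for x in data]

    def finish(self, data):
        return sum(data)

class Averaged(Pipeline):
    # Override just one piece; run() and transform() stay intact.
    def finish(self, data):
        return sum(data) / len(data)

Pipeline().run([1, 2, 3])   # 12
Averaged().run([1, 2, 3])   # 4.0
```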


> My main counters to those objections are that:
> 1. Any feature of this new proposal can already be done with explicit
> closures or the default argument hack. While usage may increase
> slightly with an officially blessed syntax, I don't expect that to
> happen to any great extent - I'm more hoping that over time, the
> default argument hack usages would get replaced

The part I like is that it removes a piece of the function signature,
which I think is already over-extended.  Although, only a small piece.
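For reference, the "default argument hack" in its minimal form -- the
shared state rides along in the signature, which is the part the
proposed syntax would remove:

```python
def counter(tag, _counts={}):
    # _counts persists across calls because default values are
    # evaluated once, at def time -- the "default argument hack".
    # It is also a visible, accidentally overridable parameter.
    _counts[tag] = _counts.get(tag, 0) + 1
    return _counts[tag]

counter('a')  # 1
counter('a')  # 2
counter('b')  # 1
```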

> 2. When an algorithm inevitably runs up against the practical limits
> of any new syntax, the full wealth of Python remains available for
> refactoring (e.g. by upgrading to a full class or closure)

I agree; any solution should be compared to an alternative, non-syntax
way of doing it.

All very interesting...

Cheers,
   Ron
 




