[Tutor] Global/Local confusions

John Abbe johnca@ourpla.net
Tue Mar 11 02:21:01 2003


Thanks for engaging me on this. My questions continue...

At 11:15 PM +0100 2003-03-04, Magnus Lycka wrote:
>John Abbe wrote:
>>In my continuing effort to understand global / local interactions --
>
>For instance, the following will print 5 and then 3:
>
>x = 3
>def f():
>     x = 5
>     print x
>f()
>print x
>
>Ok? I assume you have no problem with this. Since x is
>defined in the local scope, a lookup of the name x will
>find the local variable x, not the global variable x,
>right? Back in the global scope, we will find the global
>variable x, which is not the same.
>
>Scopes always have clear boundaries. The local scope of
>a function contains the "def...:" line and all the
>following indented lines. A local scope won't start at
>any arbitrary line in the file.
>
>So, if you swap the lines in the function body above,
>like this:
>
>x = 3
>def f():
>     print x
>     x = 5
>f()
>
>You will still have a local variable x in the local scope
>of the function f. This variable will hide the global
>variable x just as in the first example. In other words, you
>can't see the global x while you are inside the definition of
>function f. Since you do 'print x' before you do 'x = 5', you
>are trying to print something that you don't (yet) know the
>value of. Thus the error message. The name (x here, b in your
>example) is defined in the local scope, but at this point, it
>hasn't been bound to any object yet.
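Magnus's point can be reproduced in a few lines. (A minimal sketch in Python 3 print syntax, unlike the 2003-era examples in this thread; the try/except wrapper is mine, added just to capture the error message.)

```python
x = 3

def f():
    print(x)   # fails: x is local everywhere in f because of the assignment below
    x = 5

try:
    f()
    message = None
except UnboundLocalError as exc:
    message = str(exc)   # e.g. "local variable 'x' referenced before assignment"
```

The error is raised at the print line, before the assignment is ever reached, which shows that the local-ness of x is decided for the whole function body, not line by line.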

Sometimes the parser decides I mean a global variable even when I do 
not explicitly label it. E.g.:

x = [3]
def f():
    if x[0] == 3:
       print "spam"

f()

This assumption-of-global-ness seems to be limited to cases where 
there is no plain assignment to the variable anywhere in the local 
context. Why not extend the assumption-of-global-ness to cases where 
assignment only occurs after any reference?
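One way to see that this classification happens once per function body, at compile time, is to inspect the bytecode with the standard library's dis module. (A Python 3 sketch; the function names are my own illustrations.)

```python
import dis

x = [3]

def reads_only():
    # x is never assigned in this body, so the compiler
    # treats every occurrence as a global lookup
    return x[0]

def assigns_later():
    y = x[0]   # still a *local* load: the assignment below claims the name
    x = [5]
    return y

# Collect the opcodes the compiler chose for each function
read_ops = {ins.opname for ins in dis.get_instructions(reads_only)}
write_ops = {ins.opname for ins in dis.get_instructions(assigns_later)}
```

In reads_only the lookup compiles to LOAD_GLOBAL; in assigns_later every x compiles to a local load (LOAD_FAST, or LOAD_FAST_CHECK on newer interpreters), so no "assume global until assigned" extension is possible without making the decision depend on runtime order.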

>If you intend to use the global variable x inside a function
>where you make assignments to it, you must say so explicitly:
>
>x = 3
>def f():
>     global x
>     print x
>     x = 5
>f()

Obviously the "must" is not quite true, given my example above. 
Making it an absolute requirement might well be cleaner.

To confuse things further, while Python complains when a plain 
assignment appears anywhere in the function:

x = [3]
def f():
    if x[0] == 3:
       print "spam"
       x = [3, 5]

f()
x

...changing it through a method call is okay:

x = [3]
def f():
    if x[0] == 3:
       print "spam"
       x.append(5)

f()
x

Why? And is there anything else I should know about recognizing when 
a variable will be auto-recognized as global?
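The short answer is that the global/local distinction is about *binding a name*, not about changing an object: x.append(5) only reads the name x (finding the global list) and then mutates that list in place, while x = [3, 5] binds the name, which is what makes x local. A sketch of the two cases side by side (Python 3, with the results captured in variables rather than printed):

```python
x = [3]

def mutate():
    # x is only *read* here, to find the list object;
    # the object is then changed in place, and no name is bound
    x.append(5)

def rebind():
    # a plain assignment binds the name, so this x is a fresh local;
    # the global list is untouched
    x = [3, 5]

mutate()
after_mutate = list(x)   # the global list was changed in place

rebind()
after_rebind = list(x)   # the global x still holds the same list
```

So the rule to remember is: any binding operation anywhere in the function (assignment, for-loop target, function parameter, import) makes the name local for the whole body; mere reads and method calls do not.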

>This might seem silly in a tiny program, but when you are at
>line 2376 of a large file, and the global variable was defined
>at line 72, you will be very happy that Python behaves just
>the way it does...
>
>Imagine this case:
>
>x = 3
>
>def f(i):
>     if i == 'x':
>         ...
>         x = None
>         ...
>     elif i is None:
>         ...
>         x = 6
>     ...
>     print x
>     ...
>
>f(7)
>
>In this case, if i is neither 'x' nor None, you will
>get an UnboundLocalError again. For me it would certainly
>be strange if the same "print x" would either print a
>local or a global variable depending on how the if-
>statements evaluated. The current implementation does a
>lot to help programmers stay sane in the long run..

I definitely agree that conditionally interpreting a variable as 
global or local would not be desirable. But in this case, Python 
could see that assignment is possible before reference, and assume 
that I meant x to be a local variable. Any problem with that?
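For completeness, here is how the global statement resolves Magnus's branching example: one declaration covers every branch, so there is no ambiguity however the ifs evaluate. (A Python 3 sketch; the argument values passed to f are my own choices.)

```python
x = 3

def f(i):
    global x          # one declaration covers the whole body of f
    if i == 'x':
        x = None
    elif i is None:
        x = 6
    return x          # always the module-level x, whichever branch ran

f('x')
first = x     # the first branch set the global to None
f(None)
second = x    # the second branch set it to 6
f(7)          # neither branch assigns; the global keeps its last value
third = x
```

No UnboundLocalError is possible here, because every reference to x in f, in every branch, resolves to the same module-level name.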

Life,
John
-- 
  All you  /\/\ John Abbe           "If you don't like the news,
      need \  / CatHerder            go out and make some of your own."
      is... \/  http://ourpla.net/john/                --Wes Nisker