[Python-porting] Strange behavior of eval() in Python 3.1

Bo Peng ben.bob at gmail.com
Sun Aug 29 15:42:47 CEST 2010


Dear all,

This might be the wrong mailing list to ask, but I bumped into
strange behavior when I ported my code from Python 2.7 to 3.1. More
specifically, my code evaluates expressions in many local
namespaces (dictionaries). For example, the following code
evaluates expressions in a dictionary dd under Python 2.7:

Python 2.7 (r27:82525, Jul  4 2010, 09:01:59) [MSC v.1500 32 bit
(Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> dd = {'a': {1: 0.1, 2: 0.2}}
>>> print(eval('a[1]', {}, dd))
0.1
>>> print(eval("[x for x in a.keys()]", {}, dd))
[1, 2]
>>> print(eval("[a[x] for x in a.keys()]", dd, dd))
[0.1, 0.2]
>>> print(eval("[a[x] for x in a.keys()]", {}, dd))
[0.1, 0.2]

However, the last statement fails under Python 3.1

Python 3.1.2 (r312:79149, Mar 21 2010, 00:41:52) [MSC v.1500 32 bit
(Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> dd = {'a': {1: 0.1, 2: 0.2}}
>>> print(eval('a[1]', {}, dd))
0.1
>>> print(eval("[x for x in a.keys()]", {}, dd))
[1, 2]
>>> print(eval("[a[x] for x in a.keys()]", dd, dd))
[0.1, 0.2]
>>> print(eval("[a[x] for x in a.keys()]", {}, dd))
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<string>", line 1, in <module>
  File "<string>", line 1, in <listcomp>
NameError: global name 'a' is not defined
>>>

Because a is clearly accessible from the passed local dictionary, I
have no idea why it has to be defined in the global dictionary. Does
anyone know what is going on here? How can I make this work with
Python 3?
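(For what it's worth, the one case in my own transcripts above that does
work in 3.1 is the call that passes dd as globals. A sketch of that
workaround, assuming every name the expression needs can live in the
globals mapping: in Python 3 a list comprehension is compiled as a
nested scope, so a free name like a inside it is looked up in globals
rather than in the locals dict passed to eval().)

```python
dd = {'a': {1: 0.1, 2: 0.2}}

# Fails in Python 3: the comprehension body runs in its own nested
# scope, so 'a' is resolved as a global, not via the locals mapping.
#   eval("[a[x] for x in a.keys()]", {}, dd)   # NameError

# Workaround: supply (a copy of) dd as the globals dict instead.
result = eval("[a[x] for x in a.keys()]", dict(dd))
print(result)
```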

Many thanks in advance.
Bo
