eval and dict

Steffen Ries steffen.ries at sympatico.ca
Thu May 22 07:52:39 EDT 2003


Hi,

I am trying to implement lazy evaluation of user-provided expressions.

I'm using "eval(expression, dictionary)", making multiple calls that
evaluate different expressions against the same context. It turned out
that creating the dictionary up front is a performance bottleneck.
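
To make that concrete, the eager version looks roughly like this (the
names and values are made up for illustration; the real context has
many more, and far more expensive, entries):

import math

def build_full_context():
    # builds *every* possible entry up front, even the ones no
    # expression ends up using -- this is the expensive part
    return {'a': 1.0,
            'b': 2.0,
            'c': math.pi}

context = build_full_context()
for expression in ['a + b', 'b * c']:
    print eval(expression, context)   # same context, many expressions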

Since I typically don't need all possible values in the dictionary, I
was trying to defer creating the entries until an expression actually
needs them.

I tried to implement a delegating dictionary, but "eval(exp,
lazyDict)" complains "TypeError: eval() argument 2 must be dict, not
instance".

I was using a class like this:

class LazyDict:
    def __init__(self, source):
        self.source = source
        self.dict = {}
    def __getitem__(self, key):
        if self.dict.has_key(key):
            return self.dict[key]
        # look up the key in self.source, cache it in self.dict
        # and return the real value
        value = self.source[key]
        self.dict[key] = value
        return value
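
Passing an instance of that to eval() is where it blows up -- something
like this (the actual values in source don't matter):

>>> d = LazyDict({'a': 1, 'b': 2})
>>> eval('a+b', d)
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
TypeError: eval() argument 2 must be dict, not instance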

For my second try I subclassed dict, but that did not work either.
This time I got past the TypeError, but my overridden __getitem__ was
never called, e.g. (just to demonstrate, I drop the pretense of
returning a real value):

class LazyDict(dict):
    def __init__(self):
        dict.__init__(self)
    def __getitem__(self, key):
        print "__getitem__", key
        return 1

>>> d = LazyDict()
>>> eval('a+b', d)
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
  File "<string>", line 0, in ?
NameError: name 'a' is not defined
>>> d['a']
__getitem__ a
1

Is there a way to bypass this behavior of eval()?
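
The best workaround I have come up with so far is to compile the
expression first and pre-populate a plain dict with only the names it
actually references -- just a sketch, assuming eval() really does
insist on a genuine dict and bypasses an overridden __getitem__:

def lazy_eval(expression, source):
    # compile once so we can inspect which names the expression uses
    code = compile(expression, '<string>', 'eval')
    # build a plain dict holding only those entries; 'source' can stay
    # as lazy as it likes, since we index it directly
    context = {}
    for name in code.co_names:
        context[name] = source[name]
    return eval(code, context)

That only works as long as co_names gives me the right names (attribute
names show up in there too), but it does get around the dict
restriction.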

At the end of the day, I need to do this in Jython, where it actually
works the way I expect it, but I would like to understand what's going
on.

TIA
/steffen
-- 
steffen.ries at sympatico.ca	<> Gravity is a myth -- the Earth sucks!



