[Python-Dev] PEP280 and my experiment

Samuele Pedroni pedronis@bluewin.ch
Sat, 14 Jun 2003 17:33:17 +0200


At 08:46 14.06.2003 -0400, Phillip J. Eby wrote:
>At 03:15 AM 6/14/03 +0200, Samuele Pedroni wrote:
>
>>this is probably far from ideal for closures, OTOH with the right 
>>infrastructure it should be possible to store created caches e.g. in code 
>>objects and so reuse them.
>
>Perhaps I'm misunderstanding, but wouldn't code objects be a bad place to 
>put a cache, since the same code object can be used for more than one 
>function object, each with different globals?  This would be problematic 
>for code that uses 'exec code in dict' to load scripting code into a 
>restricted execution space, for example.

In that case the cache would be recreated each time, which is what was
suggested in the first place anyway (maybe my previous mails were
incomprehensible). But my experimental code already does the right thing:

 >>> c = compile("def f(): print x\nf()","<>","exec")
 >>> exec c in {'x': 1}
alloc cache
cache in code
caching
1
release my cache ref
 >>> exec c in {'x': 2}
alloc cache
cache in code
caching
2
release my cache ref

This recreation would happen if 'exec code in dict' is used to populate
different namespaces with the same set of definitions. I think that's rare.
Another, less hackish solution would be to associate a code -> cache mapping
with the dicts used for globals; that is maybe a bit more costly for the
common case.
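
Just to make that alternative concrete, here is a rough pure-Python sketch
of the per-globals-dict code -> cache mapping (the registry class and its
names are made up for illustration; the actual experiment works at the C
level inside the interpreter, not like this):

    class GlobalsCacheRegistry:
        # one cache of global lookups per (globals dict, code object) pair,
        # instead of hanging the cache off the code object alone
        def __init__(self):
            # plain dicts are not weak-referenceable, so this sketch keys on
            # id(globals_dict); a real implementation would tie the entry's
            # lifetime to the namespace itself
            self._per_namespace = {}   # id(globals dict) -> {code: cache}

        def get_cache(self, globals_dict, code):
            per_ns = self._per_namespace.setdefault(id(globals_dict), {})
            cache = per_ns.get(code)
            if cache is None:
                cache = per_ns[code] = {}   # e.g. name -> resolved value
            return cache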

Anyway, both the code -> cache mapping approach and the cache-stored-in-code
approach are better than recreating caches from scratch for closures. Storing
caches in code objects is what my experiment does:

Python 2.3b1+ (#102, Jun 12 2003, 22:35:51)
[GCC 2.96 20000731 (Red Hat Linux 7.1 2.96-98)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
 >>> 0 # sets _
0
 >>> def r(f):
...   return lambda l: reduce(f,l)
...
 >>> import operator
 >>> r(operator.add)([3,2])
alloc cache
cache in code
caching
release my cache ref
5
 >>> r(operator.mul)([3,2])
fast load cache
using cache
release my cache ref
6
 >>> r(operator.sub)([3,2])
fast load cache
using cache
release my cache ref
1
 >>>

Obviously, storing caches also inside functions when they are created would
be even better, but alone that would not address the closure case: each call
to r() above builds a new function object, so a per-function cache would
start out empty every time.
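
To spell the closure point out in pure Python (plain introspection, not part
of the experiment's mechanism; written for a current Python, where reduce
lives in functools and the code object is func.__code__ rather than
func.func_code): every call to r() creates a fresh function object, but all
of them share a single code object, which is why a cache stored in the code
object survives across r(operator.add), r(operator.mul), ... while a cache
stored in each function object would start empty every time.

    import operator
    from functools import reduce

    def r(f):
        return lambda l: reduce(f, l)

    add_all = r(operator.add)
    mul_all = r(operator.mul)

    # one shared code object for every lambda that r() hands out ...
    assert add_all.__code__ is mul_all.__code__
    # ... but a distinct function object per call, so a per-function cache
    # would have to be rebuilt from scratch on each call to r()
    assert add_all is not mul_all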

regards.