Lookup caching

Andrea Griffini agriff at tin.it
Tue Dec 12 19:02:47 EST 2006


MRAB wrote:

...

> What are you using for the timestamp? Are you calling a function to
> read a timer?

For the timestamp I used a static variable; to update the timestamp
of a dictionary I used

       d->timestamp = ++global_dict_timestamp;

I'm using a single counter for all dicts, so when checking whether a
cached value is still valid I'm verifying at the same time that the
dict is the same one that was used and that it hasn't been touched
since the lookup result was stored.
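
In rough, standalone C the scheme looks like this (field and function
names here are just for illustration, not the actual patch):

    /* Standalone model of the idea; not CPython source. */

    typedef unsigned int dict_ts_t;        /* currently a 32-bit int */

    static dict_ts_t global_dict_timestamp = 0;

    struct dict {
        dict_ts_t timestamp;               /* set on every mutation */
        /* ... the actual dictionary data ... */
    };

    /* Called from every operation that mutates a dict (and at
       creation, so no two dicts ever share a counter value). */
    static void dict_touch(struct dict *d)
    {
        d->timestamp = ++global_dict_timestamp;
    }

    /* A cached lookup remembers the timestamp the dict had when the
       result was stored.  Each counter value is handed out to
       exactly one dict, so a single equality test proves both that
       this is the same dict and that it hasn't been mutated since. */
    static int cache_valid(const struct dict *d, dict_ts_t cached_ts)
    {
        return d->timestamp == cached_ts;
    }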

Using this approach and tweaking the LOAD_GLOBAL double lookup to use
a "two timestamps, one value" cache, I got an 8%-10% speedup
(depending on the compiler options used when building Python) in a
real application of mine and about 5% in a small test.
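
The cache hooked into LOAD_GLOBAL then works more or less like this
(building on the sketch above; again the names are only illustrative):

    /* "Two timestamps, one value": remember the timestamps that the
       globals and builtins dicts had, plus the value the double
       lookup produced.  If both timestamps still match, neither dict
       has been touched (and they are the same two dicts), so the
       cached value can be reused without doing any lookup at all. */
    struct lg_cache {
        dict_ts_t globals_ts;
        dict_ts_t builtins_ts;
        void     *value;              /* PyObject * in the real code */
    };

    static void *load_global(struct lg_cache *c,
                             struct dict *globals, struct dict *builtins,
                             void *(*lookup)(struct dict *, const char *),
                             const char *name)
    {
        if (c->value != NULL &&
            c->globals_ts == globals->timestamp &&
            c->builtins_ts == builtins->timestamp)
            return c->value;          /* fast path: no dict lookup */

        /* Slow path: the usual globals-then-builtins double lookup,
           then refill the cache. */
        void *v = lookup(globals, name);
        if (v == NULL)
            v = lookup(builtins, name);
        c->globals_ts  = globals->timestamp;
        c->builtins_ts = builtins->timestamp;
        c->value       = v;
        return v;
    }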

I've yet to try more complex applications (the biggest real
application I have, however, requires a lot of external modules, so
testing the speed gain with it will require a lot more work or just
downgrading the Python version to 2.4).

Also, I'm using a 32-bit int for the timestamp... I wonder if I
should listen to the paranoid voice in my head that is crying for 64
bits instead.
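
(For scale: a 32-bit counter wraps after 2^32, i.e. roughly 4.3
billion dict mutations, and after a wrap a stale cached timestamp
could by bad luck compare equal to a fresh one; a 64-bit counter
pushes that to about 1.8e19 increments.)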


Andrea


