Caching function results

Martin A. Brown martin at linux-ip.net
Thu Mar 3 15:59:23 EST 2016


Greetings Pavel,

> Suppose, I have some resource-intensive tasks implemented as 
> functions in Python. Those are called repeatedly in my program. 
> It's guaranteed that a call with the same arguments always produces 
> the same return value. I want to cache the arguments and return 
> values and in the case of a repetitive call immediately return the 
> result without doing expensive calculations.

Great problem description.  Thank you for being so clear.

[I snipped sample code...]

This is generically called memoization.
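
To illustrate the general idea, here is a quick sketch only; 
'expensive' is a stand-in for your own function, and this assumes 
the arguments are hashable:

    import functools

    def memoize(func):
        cache = {}
        @functools.wraps(func)
        def wrapper(*args):
            # Compute only on the first call with these arguments;
            # afterwards, return the stored result.
            if args not in cache:
                cache[args] = func(*args)
            return cache[args]
        return wrapper

    @memoize
    def expensive(x, y):
        return x ** y    # placeholder for the real calculation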

> Do you like this design or maybe there's a better way with 
> Python's included batteries?

In Python, there's an implementation available for you in the 
functools module.  It's called lru_cache [0].  LRU means 'Least 
Recently Used'.
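
Usage looks roughly like this (again a sketch; maxsize=128 is an 
arbitrary choice and 'expensive' is a placeholder):

    from functools import lru_cache

    @lru_cache(maxsize=128)           # keep at most 128 distinct results
    def expensive(x, y):
        return x ** y                 # placeholder for the real calculation

    expensive(2, 10)                  # computed and cached
    expensive(2, 10)                  # answered from the cache
    print(expensive.cache_info())     # hits, misses, maxsize, currsize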

> I'd also like to limit the size of the cache (in MB) and get rid 
> of old cached data. Don't know how yet.

You can also limit the size of the lru_cache provided by the 
functools module.  The limit, though, is expressed as a number of 
entries (the maxsize argument), not in megabytes--so you will need 
to work out a mapping from your memory budget to an entry count.

Maybe others who have used functools.lru_cache can help you with how 
they solved the problem of mapping entry count to memory usage.
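
One crude way to make that estimate (only a sketch, and the budget 
figure is invented): measure a representative return value with 
sys.getsizeof() and divide your memory budget by it.  Keep in mind 
that sys.getsizeof() reports only the shallow size of an object, so 
nested containers will be undercounted.

    import sys

    sample = expensive(2, 10)               # one representative result
    budget_mb = 64                          # hypothetical memory budget
    entry_bytes = max(sys.getsizeof(sample), 1)
    maxsize = (budget_mb * 1024 * 1024) // entry_bytes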

Good luck,

-Martin

 [0] https://docs.python.org/3/library/functools.html#functools.lru_cache

-- 
Martin A. Brown
http://linux-ip.net/


