[Python-ideas] functools.lru_cache manual cache modification

Andrew Barnert abarnert at yahoo.com
Fri Dec 5 17:32:08 CET 2014


On Dec 4, 2014, at 16:49, Constantin Berhard <constantin at exxxtremesys.lu> wrote:

> On 04.12.2014 15:02, Nick Coghlan wrote:
>> As Ethan suggests, a clearer explanation of the current
>> incompatibility, and what additional cache entries you would like to
>> be able to inject may help us better understand the suggestion.
> 
> You're right. I'm sorry for being unclear.
> 
> I have now written code that compares:
> a) a stupid custom cache, which offers the interfaces I need ("f")
> b) the most stupid way to combine libtco with lru_cache ("f2")
> c) the slightly less stupid way to combine libtco with lru_cache ("f3")
> 
> A test run computes, in this order: f(6), f(6), f(4).
> In a), only the first computation has to be done; the other two are
> served from the cache afterwards.
> In b), nothing can be cached.
> In c), only calls with exactly matching arguments are cached, i.e. f(6)
> is only calculated once, but the intermediate results computed along the
> way are not used to put f(4) in the cache, so f(4) has to be completely
> recomputed.
> 
> The code is here:
> <http://nopaste.info/a82d680210_nl.html>
> It's a single Python 3 script ready for execution. These changes, when
> properly split up, will also find their way into the libtco git repository.
> 
> As you can see from the code, the interface I propose just adds two
> functions to the cache: cache_get and cache_put. I think that any cache
> should be able to provide these functions, no matter what its caching
> strategy is or how it is implemented.

You realize that, even if this change gets into the 3.5 stdlib, it's still not going to work with any of the many third-party memoization decorators Nick mentioned that work today, nor in 3.4 or earlier, right? So it seems like you're still going to need to provide a workaround--even if that's just including a backport of the new lru_cache in your module or recommending a third-party backport--unless you just want to document that your module requires 3.5+ for people who also want to use a memoization cache.

Also, it seems like you could go ahead and write a patch for 3.5 now, and put your patched version up on PyPI or include it in your module; since you're going to need to do those things even if you convince everyone, why not do it now? (This would also give you a chance to write up initial wording for the docs patch explaining what these new methods are for, which might help explain your problem, and which would let people critique the wording early.)
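
For what it's worth, one way to approximate the proposed interface on today's Pythons without touching the stdlib is to layer a plain dict in front of functools.lru_cache. Again just a sketch with made-up names (lru_cache_with_put is not a real stdlib API):

import functools

_MISSING = object()

def lru_cache_with_put(maxsize=128):
    # Hypothetical helper: functools.lru_cache plus a side dict that holds
    # manually injected entries and is consulted first.
    def decorator(func):
        cached = functools.lru_cache(maxsize=maxsize)(func)
        manual = {}   # entries added via cache_put; never evicted

        @functools.wraps(func)
        def wrapper(*args):
            hit = manual.get(args, _MISSING)
            if hit is not _MISSING:
                return hit
            return cached(*args)

        wrapper.cache_get = lambda args, default=None: manual.get(args, default)
        wrapper.cache_put = manual.__setitem__
        wrapper.cache_info = cached.cache_info
        wrapper.cache_clear = lambda: (manual.clear(), cached.cache_clear())
        return wrapper
    return decorator

The obvious limitation is that the injected entries live outside the LRU machinery, so they are never evicted and don't show up in cache_info(); but it would work on 3.4 and earlier, which is the point.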

> I'm sorry if my code is tl;dr, but I really can't think of an easier way
> of explaining it right now. I'll be happy to answer further questions, though.
> 
> Best Regards,
> Constantin
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> https://mail.python.org/mailman/listinfo/python-ideas
> Code of Conduct: http://python.org/psf/codeofconduct/

