Implementation of an lru_cache() decorator that ignores the first argument

Heinrich Kruger heindsight at kruger.dev
Fri Sep 30 14:32:17 EDT 2022


On Thursday, September 29th, 2022 at 07:18, Robert Latest via Python-list <python-list at python.org> wrote:


> Hi Chris and dh,
> 
> thanks for your --as usual-- thoughtful and interesting answers. Indeed, when
> doing these web applications I find that there are several layers of useful,
> maybe less useful, and unknown caching. Many of my requests rely on a
> notoriously unreliable read-only database outside of my control, so I cache the
> required data into a local DB on my server, then I do some in-memory caching of
> expensive data plots because I haven't figured out how to reliably exploit the
> client-side caching ... then every middleware on that path may or may not
> implement its own version of clever or not-so-clever caching. Probably not a
> good idea to try and outsmart that by adding yet another thing that may break
> or not be up-to-date at the wrong moment.
> 
> That said, the only caching that SQLAlchemy does (to my knowledge) is that it
> stores retrieved DB items by their primary keys in the session. Not worth much
> since the session gets created and dumped on each request by SQA's unit of work
> paradigm. But the DB backend itself may be caching repeated queries.
> 
> Back to Python-theory: The "Cloak" object is the only way I could think of to
> sneak changing data past lru_cache's key lookup mechanism. Is there some other
> method? Just curious.
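
Just to make sure I understand the "Cloak" idea: I imagine it's roughly
something like the sketch below (my own guess at what you mean, not your
actual code), i.e. a wrapper whose __hash__ and __eq__ deliberately ignore
the payload, so that lru_cache never sees the changing data:

    import functools


    class Cloak:
        # My guess at the kind of "Cloak" wrapper described above; purely
        # illustrative, not the original code.
        def __init__(self, payload):
            self.payload = payload

        def __hash__(self):
            # Constant hash: every Cloak hashes the same.
            return 0

        def __eq__(self, other):
            # Every Cloak compares equal to every other Cloak, so the
            # payload never influences cache hits or misses.
            return isinstance(other, Cloak)


    @functools.lru_cache(maxsize=None)
    def expensive(cloaked, second, third):
        # The changing value is reached through the wrapper, not the key.
        return (cloaked.payload, second, third)


    print(expensive(Cloak("a"), 2, 3))  # computed: ('a', 2, 3)
    print(expensive(Cloak("b"), 2, 3))  # cache hit: still ('a', 2, 3)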

You could use closures. For example, something like this:

    import functools
    import time


    def my_cache(timeout):
        # Reference point for the timeout windows, captured once when the
        # decorator is created.
        start = time.monotonic()

        def cache_decorator(func):
            wrapper = _my_cache_wrapper(func, timeout, start)
            return functools.update_wrapper(wrapper, func)

        return cache_decorator


    def _my_cache_wrapper(func, timeout, start):
        # The most recent first argument lives in the closure, not in the
        # cache key, so the cache lookup never sees it.
        first = None

        @functools.cache
        def _cached(timeout_factor, *args):
            # Only timeout_factor and *args form the cache key; `first` is
            # read from the enclosing scope.
            print("In the _cached function")
            return func(first, *args)

        def wrapper(*args):
            print("In the wrapper")
            # Stash the first argument in the closure and drop it from the
            # arguments that are used for the cache lookup.
            nonlocal first
            first, *rest = args

            # timeout_factor grows by 1 every `timeout` seconds, so the cache
            # key changes (and the value is recomputed) once per window.
            elapsed = time.monotonic() - start
            timeout_factor = elapsed // timeout

            return _cached(timeout_factor, *rest)

        return wrapper


    @my_cache(3)
    def expensive(first, second, third):
        print("In the expensive function")
        return (first, second, third)


    if __name__ == "__main__":
        print(expensive(1, 2, 3))
        print()
        time.sleep(2)
        print(expensive(2, 2, 3))
        print()
        time.sleep(2)
        print(expensive(3, 2, 3))

This should output the following:

    In the wrapper
    In the _cached function
    In the expensive function
    (1, 2, 3)

    In the wrapper
    (1, 2, 3)

    In the wrapper
    In the _cached function
    In the expensive function
    (3, 2, 3)
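
The trick is that the first argument never becomes part of the cache key:
wrapper() stashes it in the enclosing scope via nonlocal, and _cached()
reads it from the closure, so only timeout_factor and the remaining
arguments are hashed. That is why the second call is a cache hit even
though the first argument changed from 1 to 2. The timeout_factor is just
elapsed // timeout, so with a 3 second timeout the calls at roughly 0s and
2s both get factor 0.0, while the call at roughly 4s gets 1.0, which
changes the key and forces the expensive function to run again.

One thing to keep in mind is that functools.cache never evicts anything,
so every new timeout window adds entries that stay around for the lifetime
of the decorated function. If that matters, decorating _cached with
functools.lru_cache(maxsize=128) instead (the maxsize is just an example)
bounds the memory use at the cost of occasionally recomputing values.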


It's not necessarily better than your version though. :D


Kind regards,
Heinrich Kruger

