proposal for slice hashing

Chris Angelico rosuav at gmail.com
Mon May 11 16:10:28 EDT 2020


On Tue, May 12, 2020 at 6:01 AM Will Bradshaw <defenastrator at gmail.com> wrote:
> The only options as of now are:
>     1. use 3 layers of wrappers to pack the slices into a custom type that supports hashing, pass this mangled version of the arguments through the lru_cache wrapper into a function that reverses the process of the first wrapper and passes through to the underlying implementation. (see below "code for workaround" as example)
>        - This is kinda jank and arguably slow, though in my case the cost of the underlying calculation dwarfs this overhead by several orders of magnitude.
>        - The mapping may be unreliable, and the code is a rather long and impenetrable mess.
>     2. implement my own custom caching for this situation, which does not scale well and is a heck of a lot of work.
>     3. implement a special case for slices in the lru_cache function. However, this is just moving the problem into the functools library.
>
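(For reference, the wrapper approach in option 1 might look roughly like
the sketch below; _HashableSlice and slice_cache are invented names for
illustration, not the original "code for workaround".)

from functools import lru_cache

class _HashableSlice:
    # Hashable stand-in carrying the three components of a slice.
    def __init__(self, s):
        self.start, self.stop, self.step = s.start, s.stop, s.step
    def __hash__(self):
        return hash((self.start, self.stop, self.step))
    def __eq__(self, other):
        return (self.start, self.stop, self.step) == \
               (other.start, other.stop, other.step)

def slice_cache(func):
    @lru_cache
    def cached(key):
        # Innermost layer: rebuild the real slice for the implementation.
        return func(slice(key.start, key.stop, key.step))
    def wrapper(s):
        # Outermost layer: swap the unhashable slice for the stand-in.
        return cached(_HashableSlice(s))
    return wrapper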

4. Implement __getitem__ as a wrapper around a caching lookup function
that simply takes the slice's three components as ordinary arguments.

from functools import lru_cache

def __getitem__(self, key):
    # Unpack the unhashable slice into its three hashable components.
    return generate_values(key.start, key.stop, key.step)

@lru_cache
def generate_values(start, stop, step):
    ...
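
In a full class it would look something like this toy version (LazyRange
and its body are invented here just to show the shape of it):

from functools import lru_cache

class LazyRange:
    def __getitem__(self, key):
        # Only hashable values reach the cached function.
        return generate_values(key.start, key.stop, key.step)

@lru_cache
def generate_values(start, stop, step):
    # Stand-in for whatever expensive calculation is being cached.
    return list(range(start or 0, stop, step or 1))

r = LazyRange()
r[0:10:2]   # computed
r[0:10:2]   # served straight from the cache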

Not sure if this makes it easy enough that you can stop worrying about
hashability altogether.

ChrisA

