Method chaining

Rotwang sg552 at hotmail.co.uk
Sun Nov 24 10:01:55 EST 2013


On 24/11/2013 14:27, Steven D'Aprano wrote:
> On Sat, 23 Nov 2013 19:53:32 +0000, Rotwang wrote:
>
>> On 22/11/2013 11:26, Steven D'Aprano wrote:
>>> A frequently missed feature is the ability to chain method calls:
> [...]
>>> chained([]).append(1).append(2).append(3).reverse().append(4) =>
>>> returns [3, 2, 1, 4]
>>
>> That's pretty cool. However, I can imagine it would be nice for the
>> chained object to still be an instance of its original type.
>
> Why?

Well, if one intends to pass such an object to a function that does 
something like this:

def f(obj):
    if isinstance(obj, class1):
        do_something(obj)
    elif isinstance(obj, class2):
        do_something_else(obj)

then

>>> f(chained(obj).method1().method2())

looks nicer than

>>> f(chained(obj).method1().method2().obj)

and usually still works with the version of chained I posted last night. 
This isn't a wholly hypothetical example: I have functions in my own 
software that perform instance checks, and to which I often want to pass 
objects that I've just mutated several times.
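
To illustrate the point, here's a toy delegating wrapper I've just made 
up (not Steven's actual chained, just the simplest thing that forwards 
calls without subclassing the wrapped object's type):

class plainchained:
    # minimal delegating wrapper: forwards attribute access to the
    # wrapped object and returns itself when a method returns None
    def __init__(self, obj):
        self.obj = obj
    def __getattr__(self, name):
        attr = getattr(self.obj, name)
        if not callable(attr):
            return attr
        def wrapper(*args, **kwargs):
            result = attr(*args, **kwargs)
            return self if result is None else result
        return wrapper

x = plainchained([]).append(1).append(2)
isinstance(x, list)      # False - f() above would take the wrong branch
isinstance(x.obj, list)  # True, but only after unwrapping by hand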


> During the chained call, you're only working with the object's own
> methods, so that shouldn't matter. Even if a method calls an external
> function with self as an argument, and the external function insists on
> the original type, that doesn't matter because the method sees only the
> original (wrapped) object, not the chained object itself.
>
> In other words, in the example above, each of the calls to list.append
> etc. see only the original list, not the chained object.
>
> The only time you might care about getting the unchained object is at the
> end of the chain. This is where Ruby has an advantage: the base class of
> everything has a method useful for chaining, Object.tap, whereas in
> Python you either keep working with the chained() wrapper object or
> extract the original and discard the wrapper.
>
>
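(As an aside, something along the lines of Object.tap is easy enough to 
approximate by hand. A rough sketch, untested:

def tap(obj, func):
    # call func purely for its side effect, then hand obj back so the
    # chain of ordinary calls can carry on
    func(obj)
    return obj

so that e.g. tap(some_list, print).append(5) would print the list and 
then append to it.)
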
>> How about something like this:
>>
>> def getr(self, name):
>>       obj = super(type(self), self).__getattribute__(name)
>
> I don't believe you can call super like that. I believe it breaks when
> you subclass the subclass.

Yes. I don't know what I was thinking with the various super calls I 
wrote last night; apart from being buggy, they're completely unnecessary. 
Here's a better version:

class dummy:
    # placeholder base class mixed in ahead of the wrapped object's type
    pass

def initr(self, obj):
    # stand-in __init__ for the wrapper class: stash the wrapped object
    # directly, bypassing getr and any wrapped __setattr__
    object.__setattr__(self, '__obj', obj)

def getr(self, name):
    # stand-in __getattribute__: look on the wrapper itself first, then
    # fall back to the wrapped object
    try:
        return object.__getattribute__(self, name)
    except AttributeError:
        return getattr(self.__obj, name)

def methr(method):
    # wrap method so that it acts on the wrapped object and, whenever it
    # returns None, hands back the wrapper so the chain can continue
    def cmethod(*args, **kwargs):
        try:
            args = list(args)
            self = args[0]
            args[0] = self.__obj
        except (IndexError, AttributeError):
            # called without a wrapper as the first argument
            self = None
        result = method(*args, **kwargs)
        return self if result is None else result
    try:
        cmethod.__qualname__ = method.__qualname__
    except AttributeError:
        pass
    return cmethod

class chained(type):
    # metaclass: chained(obj) builds (and caches) a subclass of type(obj)
    # whose methods are wrapped with methr, then returns an instance of
    # that subclass wrapping obj
    typedict = {}
    def __new__(cls, obj):
        if isinstance(type(obj), chained):
            # obj is already a chained wrapper
            return obj
        if type(obj) not in cls.typedict:
            attrs = {}
            for t in reversed(type(obj).__mro__):
                attrs.update({k: methr(v) for k, v in t.__dict__.items()
                              if callable(v) and k != '__new__'})
            attrs.update({'__init__': initr, '__getattribute__': getr})
            cls.typedict[type(obj)] = type.__new__(
                cls, 'chained%s' % type(obj).__name__,
                (dummy, type(obj)), attrs)
        return cls.typedict[type(obj)](obj)
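
A quick check along the lines of the example upthread (the comments show 
what I expect to get):

x = chained([]).append(1).append(2).append(3).reverse().append(4)
print(x)                    # [3, 2, 1, 4]
print(isinstance(x, list))  # True, so the f() above dispatches correctly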


