[Python-Dev] buglet in unary ops on classes.

Guido van Rossum guido@beopen.com
Sun, 30 Jul 2000 11:39:57 -0500


> While poking around looking for something else, I found this little buglet:
> 
> >>> class X:
> ...     def __getattr__(self, val):
> ...             print "getattr:", val
> ...             raise AttributeError
> 
> >>> x = X()
> >>> ~x
> getattr: __invert__
> Traceback (most recent call last):
>   File "<stdin>", line 1, in ?
>   File "<stdin>", line 4, in __getattr__
> AttributeError
> 
> The unary ops all do this (raising AttributeError), but the binary ops do
> not:
> 
> >>> x+1
> getattr: __coerce__
> getattr: __add__
> Traceback (most recent call last):
>   File "<stdin>", line 1, in ?
> TypeError: __add__ nor __radd__ defined for these operands
> 
> Shouldn't this translation be done for unary ops as well? Is it safe to
> assume that instance_getattr() only returns NULL if the attribute is not
> found? PyInstance_DoBinOp() makes this assumption w.r.t. PyObject_GetAttr(),
> and I don't really see a problem with it.

In general, I'm against translating one exception into another.  There
are conceivable situations where an AttributeError from a custom
__getattr__ is caused by a bug in that __getattr__ or in something it
calls, and it's important not to destroy the traceback in that case.
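
For example (a hypothetical sketch, not code from this thread; it relies
on the classic-class behavior shown above, where special-method lookup
goes through __getattr__):

    class Config:
        def __init__(self):
            self.values = {"verbose": 0}
        def __getattr__(self, name):
            # Bug: dicts have no .fetch() method, so this line raises an
            # AttributeError that has nothing to do with 'name' being missing.
            return self.values.fetch(name)

    c = Config()
    ~c    # the AttributeError's traceback points at the .fetch() bug;
          # turning it into a generic TypeError would throw that away

Here the AttributeError is the symptom of a real bug inside __getattr__,
and the traceback is the only thing that tells you so.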

The TypeError for a binary operator violates this principle.  This is
because the coercion process has to try several things that may fail
with an AttributeError.  Saving the tracebacks is just too much work.
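
Roughly, the binary case does something like this (a pure-Python sketch
of the lookup order, not the actual C code in PyInstance_DoBinOp;
coercion and reflected-operand details are elided):

    def do_add(x, y):
        # Each lookup may raise AttributeError, possibly out of a
        # user-defined __getattr__; every such failure is swallowed.
        for obj, name in ((x, "__coerce__"), (x, "__add__"), (y, "__radd__")):
            try:
                method = getattr(obj, name)
            except AttributeError:
                continue        # this failure's traceback is discarded
            return method(y if obj is x else x)
        # Only one generic error survives, with no record of the
        # AttributeErrors that led to it:
        raise TypeError("__add__ nor __radd__ defined for these operands")

Calling do_add(X(), None) with the X class quoted above prints the two
"getattr:" lines and then raises the TypeError, matching the interpreter
output in the report.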

--Guido van Rossum (home page: http://www.pythonlabs.com/~guido/)