[Patches] Safe destruction of recursive objects

Christian Tismer tismer@tismer.com
Wed, 08 Mar 2000 17:47:06 +0100


Guido van Rossum wrote:
> 
> > > > > +   Unfortunately, deallocations also take place when
> > > > > +   the thread state is undefined.
...
> I am worried about the possibility of this code being run without the
> interpreter lock held -- it allocates a list and calls
> PyList_Append(), it's not safe to do this when another thread could be
> doing the same thing.  PyList_New() is thread-safe, but
> PyList_Append() to a list reachable via a global is not!
> 
> Should I worry?

Well, we can avoid this completely by using tuples.
No list.append at all; instead I create a new 2-tuple
for every to-be-deleted object, holding
(to-be-deleted, old-tuple)
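
Sketched in plain C rather than the real Python C API (all names
here are invented for illustration), this is the "trashcan"
pattern: once deallocation recurses too deep, push the object onto
a chain whose links play the role of those 2-tuples, and drain the
chain iteratively after the C stack has unwound:

```c
#include <stdlib.h>

/* A recursively linked object whose naive destructor would recurse
   once per link -- think of a deeply nested list. */
typedef struct Obj {
    struct Obj *next;
} Obj;

/* The chain of objects waiting to be destroyed; each link holds
   (to-be-deleted, old-chain), mirroring the 2-tuple idea. */
typedef struct Trash {
    Obj *op;
    struct Trash *prev;
} Trash;

static Trash *trash_chain = NULL;
static int trash_nesting = 0;   /* current dealloc recursion depth */
static int destroyed = 0;       /* bookkeeping for the example */

/* Defer destruction: link op onto the chain instead of recursing. */
static void deposit(Obj *op)
{
    Trash *t = malloc(sizeof *t);
    t->op = op;
    t->prev = trash_chain;
    trash_chain = t;
}

static void dealloc(Obj *op);

/* Drain the chain iteratively.  Nesting is held above zero for the
   whole loop so nested deallocs deposit again instead of
   re-entering this function. */
static void destroy_chain(void)
{
    trash_nesting++;
    while (trash_chain != NULL) {
        Trash *t = trash_chain;
        Obj *op = t->op;
        trash_chain = t->prev;
        free(t);
        dealloc(op);
    }
    trash_nesting--;
}

static void dealloc(Obj *op)
{
    if (trash_nesting > 20) {   /* too deep: defer instead of recursing */
        deposit(op);
        return;
    }
    trash_nesting++;
    if (op->next)
        dealloc(op->next);      /* bounded by the guard above */
    free(op);
    destroyed++;
    trash_nesting--;
    if (trash_nesting == 0)
        destroy_chain();        /* stack has unwound; safe to continue */
}
```

With the guard in place, destroying a chain of 100000 objects never
uses more than roughly two dozen C stack frames.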

> Other comments on the patch (which I didn't have the time to review
> earlier):

But they didn't reach me - was it private email?

> - Can we avoid creating new little files for every little feature?
> This stuff should go into object.{c,h}.

Agreed, no problem.

> - Your _PyTrash_deposit_object(op) doesn't check for errors from
> PyList_New() and PyList_Append().  Typically, destructors shouldn't do
> anything that could cause an exception to be raised, because
> destructors may be called while an existing exception is being
> handled.  You'd have to wrap your code in PyErr_Fetch() and
> _Restore(), like classobject.c does when it calls __del__ from a
> deallocator.

I think the tuple way sketched above avoids this and is more
elegant, although I'm losing the elegance of appending
a 0-refcount object to a list. :-)

> - Please submit the append compatibility patch separately.

Will do so.

I have one further comment:
While I appreciate the general use of PyArg_ParseTuple for
consistency and clarity, there is remarkable run-time overhead
to using it in every simple case. There are quite often
situations where you have very easy things like "(O)", and in
very simple functions like len(), the tuple parser consumes
over 90% of the computation time.

What I do in these simple cases is: if there is nothing more
to check than the length of the argument tuple, I use a direct
implementation, like I did in the list.append compatibility
patch. PyArg_ParseTuple is then used only to do the *error
report*. This avoids parsing the obvious cases, but you still
get the verbose standard error messages.
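
A plain-C sketch of that fast-path pattern (parse_args and my_len
are made-up stand-ins for PyArg_ParseTuple and a len()-style
wrapper; the real code would fall back to PyArg_ParseTuple itself):

```c
#include <stdio.h>
#include <string.h>

/* Stand-in for the general-purpose argument parser: flexible but
   comparatively slow, and the only place the standard error
   message is produced.  Returns 0 on error, 1 on success. */
static int parse_args(int argc, int expected, const char *fname)
{
    if (argc != expected) {
        fprintf(stderr, "%s() takes exactly %d argument(s) (%d given)\n",
                fname, expected, argc);
        return 0;
    }
    /* ...full format-driven conversion would happen here... */
    return 1;
}

/* len()-style wrapper: the common case costs one comparison; the
   parser runs only when that check fails, purely for its verbose
   error report. */
static int my_len(int argc, const char **args, int *result)
{
    if (argc != 1)
        return parse_args(argc, 1, "len");  /* always 0: error path */
    *result = (int)strlen(args[0]);         /* direct access, no parsing */
    return 1;
}
```

The success path never touches the parser, yet a caller who passes
the wrong number of arguments gets exactly the message the general
parser would have produced.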

I think we (I) could supply a handful of macros for this
purpose (give me quick access but be as verbose on errors
as before) and gain a good amount of speedup.
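
Such a macro might look like this in the same plain-C model (the
name SIMPLE_ARGS is purely hypothetical): the common case expands
to a single comparison, and only the failure case pays for
producing the verbose message:

```c
#include <stdio.h>

/* Hypothetical fast-path macro: 1 if the argument count matches,
   otherwise report the standard-style error and yield 0. */
#define SIMPLE_ARGS(argc, n, fname)                                       \
    ((argc) == (n) ? 1                                                    \
     : (fprintf(stderr, "%s() takes exactly %d argument(s) (%d given)\n", \
                (fname), (n), (argc)), 0))

/* Example one-argument function using the macro as its only
   argument check. */
static int my_abs(int argc, const int *args, int *result)
{
    if (!SIMPLE_ARGS(argc, 1, "abs"))
        return 0;               /* error already reported by the macro */
    *result = args[0] < 0 ? -args[0] : args[0];
    return 1;
}
```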

cheers - chris

-- 
Christian Tismer             :^)   <mailto:tismer@appliedbiometrics.com>
Applied Biometrics GmbH      :     Have a break! Take a ride on Python's
Kaunstr. 26                  :    *Starship* http://starship.python.net
14163 Berlin                 :     PGP key -> http://wwwkeys.pgp.net
PGP Fingerprint       E182 71C7 1A9D 66E9 9D15  D3CC D4D7 93E2 1FAE F6DF
     we're tired of banana software - shipped green, ripens at home