garbage collection could force destructors?

Alex Martelli aleaxit at yahoo.com
Fri Nov 3 11:01:01 EST 2000


"Pete Shinners" <pete at visionart.com> wrote in message
news:8tnlim$34n$1 at la-mail4.digilink.net...
> the talk about destructors got me thinking. now that garbage collection is
> a base part of python2, does that make it possible to define some behavior
> for object destructors? i know before there were many cases where an
> object might never be deleted, so it was impossible to guarantee the
> destructors.

As far as I know, instances of classes that define __del__ (the
destructor method) do not participate in the garbage collection
of mutual-reference cycles.  I have not studied the sources,
but I have seen this point mentioned in discussions in this group.
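
You can watch this happen for yourself (a minimal sketch,
assuming only the gc module's documented behaviour):

    import gc

    class Noisy:
        def __del__(self):
            print 'finalizing', self

    a = Noisy()
    b = Noisy()
    a.partner = b        # a cycle of mutual references
    b.partner = a
    del a, b             # no names left, yet the cycle keeps both alive
    gc.collect()         # the collector does find the cycle...
    print gc.garbage     # ...but parks it here; no __del__ ever runs

Rather than guess which __del__ to run first, the collector
leaves the whole cycle in gc.garbage for you to break by hand.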

The problem of the order of execution of destructors proved,
apparently, too hard to tackle (at least in this release).
I sympathize...!


> now with garbage collection, it seems all objects could be caught and
> destructed in a logically-defined order. is this possible? i haven't

I think there is no generally useful 'order' one could pick, given
the unrestricted ability to pass references around, copy them wherever
you wish, create cycles of mutually referencing objects -- and this
unrestricted ability (everything first-class, no limitations, &c)
makes up a large part of Python's power.

Among objects without cycles of mutual reference, by the very
definition of such cycles, the references form a DAG.  When you
have determined that an object X must go, you could then determine
the sub-DAG of objects that will eventually go as a result
of X's disappearance (_assuming_ a __del__ method cannot
change references -- a rather heavy assumption, btw).  Such a
determination has a substantial cost, but say we're willing
to pay it (including the sub-cost of detecting that a cycle
_does_ exist, in which case our needs are different).  The
references in the sub-DAG rule out some 'topological sorts'
(orderings of the destructors) as unacceptable, but in general
several remain.
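
Here's a toy sketch of that determination (the names and the
dict representation are invented purely for illustration --
CPython does nothing of the sort): given a reference graph,
find the objects doomed by X's death, then emit one acceptable
topological sort, destroying each referrer before the objects
it references, so its __del__ can still see them:

    def reachable(graph, start):
        # every object reachable from start by following references
        seen = {start: 1}
        stack = [start]
        while stack:
            for m in graph[stack.pop()]:
                if not seen.has_key(m):
                    seen[m] = 1
                    stack.append(m)
        return seen

    def destruction_order(graph, x, live_roots):
        # doomed: reachable from x, but from no still-live root
        alive = {}
        for r in live_roots:
            alive.update(reachable(graph, r))
        doomed = {}
        for n in reachable(graph, x).keys():
            if not alive.has_key(n):
                doomed[n] = 1
        # for each doomed object, count its doomed referrers...
        indegree = {}
        for n in doomed.keys():
            indegree[n] = indegree.get(n, 0)
            for m in graph[n]:
                if doomed.has_key(m):
                    indegree[m] = indegree.get(m, 0) + 1
        # ...then repeatedly destroy an object that no doomed
        # referrer still points at (Kahn's algorithm: referrers
        # always go before their referents)
        ready = [n for n in doomed.keys() if indegree[n] == 0]
        order = []
        while ready:
            n = ready.pop()
            order.append(n)
            for m in graph[n]:
                if doomed.has_key(m):
                    indegree[m] = indegree[m] - 1
                    if indegree[m] == 0:
                        ready.append(m)
        return order

    refs = {'x': ['a', 'b'], 'a': ['c'], 'b': ['c'], 'c': []}
    print destruction_order(refs, 'x', [])   # e.g. ['x', 'b', 'a', 'c']

Note what even this toy pays: a full reachability pass per
death, plus the blanket assumption that no __del__ rewires the
graph mid-walk.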

What usefulness is there in further constraining the
destruction-order, beyond what is implied by the existence
of references in the sub-DAG?  What 'further constraints'
would you impose (always keeping in mind that any
constraint 'A before B' *cannot* be respected if you
can reach A from B by following references in the sub-DAG,
i.e. B is before A in the partial ordering that the
references determine)...?

"``Sometimes'' (if all other circumstances permit doing so
and it's not too hard to prove it possible) objects with
characteristic A will be destroyed before objects with
characteristic B" does not sound to me like a 'guarantee'
that buys me anything much... what would I _do_ with such
a (necessarily heavily conditional) "fuzzy guarantee"?

Don't forget that such dubiously-useful 'further
constraints' WILL constrain the garbage collection process.
For example, if they were phrased so as to make it impossible
to have GC proceed 'generationally', then a _huge_ potential
optimization would be thrown to the winds... for what gain?
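
(If memory serves, the 2.0 collector already has a generational
skeleton -- three generations with tunable thresholds -- which is
exactly the kind of machinery such constraints could strangle:

    import gc

    print gc.get_threshold()        # e.g. (700, 10, 10): allocation
                                    # counts triggering each generation
    gc.set_threshold(1000, 15, 15)  # tune how eagerly each one runs

...so the 'optimization' is not even hypothetical.)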


> dug too deep to understand the gc, but if it is now possible to do
> something about this, i think everyone would be better off.

Suppose it _is_ possible to add some (fuzzy) further
constraints of the above form.  They would slow down
object destruction (by the need to determine the sub-DAG
to which they apply), impose strict conditions on what
a __del__ can do (no mucking around with references),
impede a future optimization towards generational GC
(and perhaps impede a port to .NET right now, since I
believe .NET's GC _is_ generational from the start) --
and still probably remain too fuzzy and conditional to
let you base any specific architectural choice on
destruction order, except maybe in a few very, very
special cases.
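
(And that 'no mucking around' clause is no small imposition:
today a __del__ may quite legally resurrect its own object --

    zombies = []

    class Phoenix:
        def __del__(self):
            zombies.append(self)   # resurrected: reachable once more,
                                   # so any precomputed order is stale

    p = Phoenix()
    del p              # __del__ runs, but the object survives it
    print zombies      # the 'destroyed' instance, alive and well

-- so an ordering scheme must either outlaw this, or recompute
its sub-DAG after every single destructor runs.)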

Do you really think _anybody_ would be better off then...?


Alex
