RFC: Proposal: Deterministic Object Destruction

Ooomzay ooomzay at gmail.com
Sun Mar 4 19:58:38 EST 2018


On Sunday, 4 March 2018 15:24:08 UTC, Steven D'Aprano  wrote:
> On Sun, 04 Mar 2018 03:37:38 -0800, Ooomzay wrote:
> 
> > Please consider the case of a composite resource: You need to implement
> > __enter__, __exit__ and track the open/closed state at every level in
> > your component hierarchy - even if some levels hold no resources
> > directly.
> > This is burdensome, breaks encapsulation, breaks invariance and is error
> > prone ...very unpythonic.
> 
> Without a more concrete example, I cannot comment on these claims.

Here is an example of a composite resource using RAII:-

class RAIIFileAccess():
    def __init__(self, fname):
        self.fname = fname
        print("%s Opened" % fname)
    def __del__(self):
        print("%s Closed" % self.fname)

class A():
    def __init__(self):
        self.res = RAIIFileAccess("a") 

class B():
    def __init__(self):
        self.res = RAIIFileAccess("b")

class C():
    def __init__(self):
        self.a = A()
        self.b = B()
                 
def main():
    c = C()

Under this PEP this is all that is needed to guarantee that the files "a" 
and "b" are closed on exit from main or after any exception has been handled.

Also note that if you hold a reference to any of these objects, it is
guaranteed to be in a valid/usable/open state (the invariant) - there is
no danger, and no need to worry about or check enter/exit state.

Now repeat this example with "with".
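
For contrast, here is a sketch (one possible phrasing, with made-up CM* names)
of what the same composite looks like when every level has to be a context
manager - each class grows __enter__/__exit__ plumbing and acquires an extra
"constructed but not yet entered" state:

class CMFileAccess():
    def __init__(self, fname):
        self.fname = fname
    def __enter__(self):
        print("%s Opened" % self.fname)
        return self
    def __exit__(self, *exc):
        print("%s Closed" % self.fname)
        return False

class CMA():
    def __init__(self):
        self.res = CMFileAccess("a")
    def __enter__(self):
        self.res.__enter__()
        return self
    def __exit__(self, *exc):
        return self.res.__exit__(*exc)

class CMB():
    def __init__(self):
        self.res = CMFileAccess("b")
    def __enter__(self):
        self.res.__enter__()
        return self
    def __exit__(self, *exc):
        return self.res.__exit__(*exc)

class CMC():
    def __init__(self):
        self.a = CMA()
        self.b = CMB()
    def __enter__(self):
        # holds no resource itself, yet must forward enter/exit anyway
        self.a.__enter__()
        self.b.__enter__()
        return self
    def __exit__(self, *exc):
        # must release both (error handling if one __exit__ raises is
        # omitted for brevity)
        self.b.__exit__(*exc)
        self.a.__exit__(*exc)
        return False

def main():
    with CMC() as c:
        pass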

> [...]
> > My PEP is about improving the linguistic integrity and fitness for
> > resource management purpose of the language.
> 
> So you claim, but *requiring* reference counting semantics does not 
> improve the integrity or fitness of the language. 

We will just have to disagree on that for now.

> And the required 
> changes to programming styles and practices (no cycles, 

No change required. But if you _choose_ to benefit from RAII, you had better
not create orphan cycles containing RAII objects, as that is clearly a
resource leak.
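
For illustration, a sketch (made-up names, assuming CPython 3.4+) of the kind
of orphan cycle meant here - the refcounts never reach zero, so the __del__s
only run when the cycle collector (or interpreter exit) gets around to it:

import gc

class Res():
    def __init__(self, name):
        self.name = name
        print("%s Opened" % name)
    def __del__(self):
        print("%s Closed" % self.name)

def leaky():
    a = Res("a")
    b = Res("b")
    a.peer = b   # a -> b
    b.peer = a   # b -> a: an orphan cycle once leaky() returns

leaky()
print("leaky() has returned - nothing has been closed yet")
gc.collect()     # finalised here, at some non-deterministic later time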

> no globals, 

No change required. But if you _choose_ to benefit from RAII, you had better
take care to delete, in a robust way, any RAII resources you choose to hold
at global scope. (Such cases are exceptional in my experience.)
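
For example (a sketch, reusing RAIIFileAccess from above; LOG and shutdown
are made-up names):

# A module-level RAII resource is only released when its last reference
# goes away - left alone, that means interpreter shutdown. So drop it
# deliberately at a well-defined point:
LOG = RAIIFileAccess("global-log")

def shutdown():
    global LOG
    del LOG      # last reference dropped here -> "global-log Closed"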

> put all resources inside their own scope) 

No change required. But if you _choose_ to benefit from RAII, you can make use
of Python's existing scopes (functions), or del, to restrict resource lifetimes.
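
For example (again a sketch, reusing RAIIFileAccess from the example above):

from time import sleep

def scoped():
    res = RAIIFileAccess("scratch")   # opened on entry
    # ... use res ...
    # closed automatically when scoped() returns and res goes away

def long_running():
    res = RAIIFileAccess("scratch")
    # ... use res ...
    del res                           # or drop it early with del,
    sleep(10000)                      # before the long-running part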

> >> In any case, you might not like with statements, but I think they're
> >> infinitely better than:
> >> 
> >> def meaningless_function_that_exists_only_to_manage_resource():
> >>     x = open_resource()
> >>     process(x)
> > 
> >> def function():
> >>     meaningless_function_that_exists_only_to_manage_resource()
> >>     sleep(10000)  # simulate a long-running function
> > 
> > Why would you prefer a new construct?
> 
> I don't prefer a new construct. The "with" statement isn't "new". It goes 
> back to Python 2.5 (`from __future__ import with_statement`) which is 
> more than eleven years old now. That's about half the lifetime of the 
> language!
> 
> I prefer the with statement because it is *explicit*, simple to use, and 
> clear to read. I can read some code and instantly see that when the with 
> block ends, the resource will be closed, regardless of how many 
> references to the object still exist.
> 
> I don't have to try to predict (guess!) when the last reference will go 
> out of scope, because that's irrelevant.

If you don't care what the other references might be, then RAII is not
for you. Fine.
 
> RAII conflates the lifetime of the object with the lifetime of the 
> resource held by the object. 

This "conflation" is called "invariance" and is usually considered a 
"very good thing" as you cant have references floating around to
half-baked resources.

> They are not the same, and the object can 
> outlive the resource.

Not with RAII it can't. Simple. Good. 
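
With a context manager, on the other hand, the handle object really does
outlive the closed resource, and the invariant is broken:

with open('/tmp/demo', 'w') as f:
    f.write("hi")
# f is still reachable, but the resource behind it is gone:
f.write("again")   # ValueError: I/O operation on closed file.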

> Your position is:
> 
> "RAII makes it really elegant to close the file! All you need to do is 
> make sure that when you want to close the file, you delete all the 
> references to the file, so that it goes out of scope, and the file will 
> be closed."
> 
> My position is:
> 
> "If I want to close the file, I'll just close the file. Why should I care 
> that there are zero or one or a million references to it?"

Because if you have no idea what references there are, you cannot assume it
is OK to close the file! That would be a truly terrible program design.

> >> - the with block is equivalent to a try...finally, and so it is
> >>   guaranteed to close the resource even if an exception occurs; your
> >>   solution isn't.

It is: RAII will release all resources held, transitively. Try the example
above under CPython with an exception raised in one of the constructors.
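
For instance (a sketch reusing RAIIFileAccess and A from the example above;
FlakyB/FlakyC are made-up variants whose construction fails part-way):

class FlakyB():
    def __init__(self):
        self.res = RAIIFileAccess("b")
        raise RuntimeError("constructor failed after acquiring b")

class FlakyC():
    def __init__(self):
        self.a = A()
        self.b = FlakyB()

def main():
    try:
        c = FlakyC()   # prints "a Opened", "b Opened", then raises
    except RuntimeError:
        pass
    # Once the exception has been handled and its traceback released,
    # nothing references the half-built objects any more, so refcounting
    # closes both files (at the latest when main() exits).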

> >> If process(x) creates a non-local reference to x, and then raises an
> >> exception, and that exception is caught elsewhere, x will not go out of
> >> scope and won't be closed.
> >> A regression in the reliability of the code.

> > This PEP does not affect existing code. Peeps who are familiar with RAII
> > understand that creating a global reference to an RAII resource is
> > explicitly saying "I want this kept open at global scope" and that is
> > the behaviour that they will be guaranteed.
> 
> I'm not talking about global scope. Any persistent reference to the 
> object will prevent the resource from being closed. 
> 
> Here's a proof of concept which demonstrates the problem with conflating 
> object scope and resource lifetime. Notice that there are no globals used.
> 
> def main():
>     from time import sleep
>     values = []
>     def process():
>         f = open('/tmp/foo', 'w')
>         values.append(f)
>         f.write("Hello world!")
>         f.read()  # oops!
>         del values[-1]
>     try:
>         process()
>     except IOError:
>         pass
>     # The file should be closed now. But it isn't.

That's because the handled exception's traceback is still referencing the
process() frame until you clean up the exception state (sys.exc_clear() in
Python 2; Python 3 drops it when the except block exits), and, more to the
point, the "del values[-1]" line never ran, so the values list itself still
holds a reference to the file, keeping it open till the end of the function.

>     sleep(10)  # simulate doing a lot of work
>     g = open('/tmp/foo', 'r')
>     assert g.read() == "Hello world!"
> 
> 
> The assertion fails, because the file hasn't be closed in a timely 
> manner. On the other hand:
> 
> def main2():
>     from time import sleep
>     values = []
>     def process():
>         with open('/tmp/bar', 'w') as f:
>             values.append(f)
>             f.write("Hello world!")
>             f.read()  # oops!
>             del values[-1]
>     try:
>         process()
>     except IOError:
>         pass
>     sleep(10)  # simulate doing a lot of work
>     g = open('/tmp/foo', 'r')
>     assert g.read() == "Hello world!"
> 
> 
> The assertion here passes.
> 
> Now, these are fairly contrived examples, but in real code the resource 
> owner might be passed into an iterator, or bound to a class attribute, or 
> anything else that holds onto a reference to it. As soon as that happens, 
> and there's another reference to the object anywhere, RAII will be unable 
> to close the resource in a timely manner.

You should never be closing a resource while other references to it
exist. RAII will not forgive such a design.

This is a good thing.


