Will python never intend to support private, protected and public?

Paul Rubin
Sat Oct 1 04:16:40 EDT 2005


Steven D'Aprano <steve at REMOVETHIScyber.com.au> writes:
> A cautionary tale of what happens when religious wars enter programming
> debates. For all I know, Paul Rubin is intelligent, gentle, kind to
> animals and small children, generous, charitable and modest.

Don't bet on it.

> But touch his religious belief in the necessity of "truly" private
> variables, and boy oh boy does he lose the plot.

I haven't claimed necessity (except in some specific types of
applications).  I've said it's a useful feature in a certain type of
development process.  So I think you're the one losing the plot and
acting like a religious whacko.  If there's a feature you don't want
to use, nobody is making you use it.  You're going off the deep end
saying nobody else should be allowed to use it either.
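For anyone who hasn't followed the thread, here is a minimal sketch
(class and attribute names invented for illustration) of what Python
offers today.  Name mangling is a convention, not an enforcement
mechanism, and that gap is exactly the feature under discussion:

    class Account:
        def __init__(self, balance):
            self.__balance = balance   # mangled to _Account__balance

        def balance(self):
            return self.__balance

    acct = Account(100)
    # The mangled name is still reachable, so nothing stops this:
    acct._Account__balance = -999
    print(acct.balance())              # prints -999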

> Paul misses the point: in a language like Python that allows the dynamic
> modification of public methods (as well as private ones) you can't be
> sure with absolute 100% mathematical certainty that your public interface
> is the same at runtime as it was when you did your last code audit.

Obviously any privacy-enforcement mechanism has to cope with that.
Think of the situation with Bastion, back when it existed and was
believed to sort-of work.
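To make the point concrete, here's a sketch (the class is invented) of
a public method being rebound at runtime, long after any code audit
has finished:

    class Document:
        def save(self):
            return "saved safely"

    # Any code that can reach the class can rebind its public methods:
    Document.save = lambda self: "saved... and mailed somewhere else"

    doc = Document()
    print(doc.save())   # the audited interface is not what actually runs

Any enforcement mechanism worth the name has to account for rebinding
like this, just as Bastion tried to.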

> If your development team uses test-driven development, you will catch this
> breakage as soon as you check the code in. If you don't, you've got
> lots more bugs that you don't know about, so one more is no big deal.

Methinks you have too much faith in testing.  It only shows that your
program handles the test cases that you give it.  To show that the
program handles all the other cases, you need actual logic.
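A two-line illustration (the function and its bug are made up, but the
pattern is everywhere):

    def is_leap_year(year):
        # Buggy: ignores the century rules.
        return year % 4 == 0

    assert is_leap_year(2004)       # passes
    assert not is_leap_year(2003)   # passes
    # Every test the author thought of passes, yet is_leap_year(1900)
    # wrongly returns True.  Only reasoning about the Gregorian
    # calendar rules, not more of the same tests, exposes the bug.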

> > Why on earth would you want to unnecessarily introduce random bugs
> > into a word processor or anything else?  
> 
> Paul puts words into my mouth: I never suggested that it was good to
> introduce bugs, unnecessarily or otherwise. What I said was that aiming
> for 100% mathematical certainty that there are no bugs in a word processor
> is pointless. No lives are at stake. The fate of nations doesn't rest on
> your ability to make sure that PyWord is 100% bug free. Relax dude, the
> world won't end if there is a bug or two in your code.

See:  http://vowe.net/archives/005838.html  
(found with about 5 seconds of Google searching).

People could very well have died over disclosures like that, whether
caused by software bugs or (as in this instance) by an unlucky
interaction of intentional features.  These things happen over and over.

If you look at the NSA's initial release of the Skipjack algorithm
spec, it was a PDF file containing a bitmap that was obviously a scan
of a fax of a document that had come out of a word processing program.
People chuckled that a mighty operation like the NSA could be so
technologically backward that they couldn't just put the word
processor file directly online.  Later they realized that maybe the
NSA was simply being careful, that the document had been prepared in a
classified environment and they didn't trust their word processing
software to not leak classified info into the file, so they stuck to a
policy of only moving data like that around on paper and not in
computer files.

Information leakage from word processors is a serious problem.

> Of course, if Paul had been paying attention, he would have realised that
> 100% certainty is not possible in general no matter what style of
> development you use. Perhaps folks have heard of the halting problem?

You don't know what you're talking about.  That is a red herring: the
halting problem says no algorithm can decide correctness for arbitrary
programs, not that a particular program can't be proved correct.

> You can reduce the probability of bugs by using best practices, but never
> reduce it to zero. How small do you need? Is it really shocking to suggest
> that while a one in a billion billion chance of a bug might be
> needed for a nuclear reactor, we might be satisfied with just one in a
> million for a word processor?

In some ways, word processors are harder to write safely than nuclear
reactor code.  Nuclear reactors usually run in carefully controlled
environments and the field is mature enough that you can have some
reasonable confidence that the data going into the program (e.g. from
sensors) will be similar to the data that you tested the program with.
What you're left with is Murphy's Law: something unexpected could
happen by accident, but you can make reasonable assumptions about the
distribution of inputs to keep that probability low.  Ross Anderson
calls this "programming Murphy's computer".

A word processor, on the other hand, has to deal with malicious input
that can be fiendishly crafted in ways the implementer never
dreamed of.  It's the difference between worrying about being hit by a
meteor at random, and worrying about being hit by a meteor because
someone in the asteroid belt is steering them towards you on purpose.
Anderson calls this "programming Satan's computer".
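The defensive posture that difference forces on you is easy to sketch.
The file format, field names, and size limit below are all invented,
but the pattern is the real point:

    import struct

    MAX_RECORD = 1 << 20   # an arbitrary sanity limit for this sketch

    def read_record(f):
        # Murphy's computer: trust the length field in the file.
        (n,) = struct.unpack("<I", f.read(4))
        return f.read(n)

    def read_record_safely(f):
        # Satan's computer: assume the length field was chosen by an
        # adversary, and validate it before acting on it.
        (n,) = struct.unpack("<I", f.read(4))
        if n > MAX_RECORD:
            raise ValueError("implausible record length: %d" % n)
        return f.read(n)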

> Should we decide that C is not suitable for critical applications?

Yes, absolutely.  C (by which term I include C++) is unsuitable for
critical applications, including word processors.  C is woefully
underspecified and has no type safety, yada yada.  It's even worse
than Python.  People write those applications in C today anyway only
because they are irresponsible and/or insane.  Look at all the buffer
overflow exploits in Microsoft Office that keep appearing every day of
the week.  If Office had not been written in C, those bugs wouldn't
have anything like such bad consequences.
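The contrast is trivial to demonstrate.  This toy snippet (not
Office's code, obviously) shows the same class of mistake that in C
silently corrupts memory:

    buf = bytearray(16)
    try:
        buf[100] = 0xFF            # an out-of-bounds write
    except IndexError as err:
        print("refused:", err)     # an exception, not an exploit

In Python you get a traceback; in C you get, eventually, a CERT
advisory.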

> In any case, there is a difference between mission-critical applications
> (such as, perhaps, a word processor) and what are generally called "real
> time" critical systems such as those controlling nuclear power stations
> and the like. I doubt that Python runs on any real-time operating
> systems suitable for such critical systems, but perhaps I'm wrong.

I don't see any point to that.  Critical is critical whether it's
real-time or not.  If you write an inventory management program and
someone uses it to route medical supplies into a disaster area, it can
kill people.


