[Types-sig] Non-conservative inferencing considered harmful

Paul Prescod paul@prescod.net
Wed, 22 Dec 1999 17:00:02 -0600


Greg Stein wrote:
> 
> BUT!! -- you never said what the error in the program was, and what the
> type-checker was supposed to find:
> 
> 1) was "a" filled in with inappropriate types of values?
> 2) was "j" assigned a type it wasn't supposed to hold?
> 3) was "k" declared wrong?
>
> In the absence of knowing which of the three cases is wrong, I *strongly*
> maintain that the error at the "k = j" assignment is absolutely correct.
> How is the compiler to know that "a" or "j" is wrong? You didn't tell it
> that their types were restricted (and violated).

That's right. This is all valid Python code and should *not* give an
error message. That's why I've been trying to make a distinction between
a type-safety declaration and a type declaration. If you didn't ask for
this code to be type-safe then it won't cause a problem. Here's where
the error might arise:


... 10,000 lines of code ...

for el in a:
   j = el.doSomething()

... 10,000 lines of code ...

type-safe
def foo( k: Int ):
	k = j

> In other words, the error message is NOT "miles away". It is exactly where
> it should be. When that error hit, the programmer could have went "oops! I
> declared k wrong. I see that j is supposed to be an Int or a String.
> okay... lemme fix that..."

No, here's what really happens (based on my experience with ML):

"Oooops. Something weird has happened. Why would it expect an int or a
string? Where does this variable get its value? Humm. What are the
possible types of el? Humm, what are the possible contents of a? Hummm.
Why does this language make it so hard for me to find my errors?"

> Find another example -- this one doesn't support your position that
> inferencing is harmful.

It absolutely does. If I can't jump immediately from the error message
to the line that causes the problem then something is wrong with the
type system. I shouldn't have to "debug" compiler errors by inserting
declarations here and there.
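To make that concrete, here is a sketch in the same hypothetical
declaration syntax as the example above (the "decl" spelling and the
IntSource name are my inventions for illustration): if "a" carried an
explicit declaration, the checker could complain at the line that
actually introduces the surprising type, not 10,000 lines later:

decl a: List[IntSource]    # hypothetical: promise what "a" holds

for el in a:
   j = el.doSomething()    # error reported HERE if doSomething()
                           # can return a String as well as an Int

type-safe
def foo( k: Int ):
	k = j              # no longer blamed for an upstream mistake

With the declaration, the mistaken assumption is caught where it is
made; without it, the inferencer can only report the first place the
inconsistency happens to become visible.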

> Sure. And in the absence of saying so, the inferencer above did exactly
> what it was supposed to do. You have one of three problems in your code,
> yet no way for any compiler to know which one you meant.

It shouldn't have let me get so far without saying *exactly what I
mean*. It shouldn't have tried to read my mind. I'm the human asking for
the computer's help in keeping things straight. If it tries to read my
mind, it's going to make the same mistakes I make! Rather, it should
tell me as soon as I've done something fishy.

> I hate the thought that people are going to start feeling that they should
> put declarations into their code. 

That's *inevitable*. People who want statically type-checked code are
inevitably going to start feeling that they must put declarations in
their code. That's the case with ML. That's the case with Haskell. That
will be the case with Python. We aren't smarter here than all of the
programming language researchers in the world.

Just as programmers mistrust the ML and Haskell inferencers, they will
distrust the Python inferencer. Just as programmers (me, Tim, the editor
of the Journal of Functional Programming) always put declarations in
ML and Haskell code, we will do so for Python. One or two cryptic error
messages that take you fifteen minutes to "debug" are enough to turn you
off inferencing really quickly.

Therefore our real decision is: do we want to force programmers who want
static type checking to sprinkle their code with declarations
*explicitly* or do we want to wait until they get frustrated with the
inferencer? It seems to me that "simple and explicit" is more Pythonic
than "we'll guess what you mean and you can be explicit if we guess
wrong." If you don't want to put in declarations, don't use static type
checking!

> We will lose one of the best things of
> Python -- the ability to toss out and ignore all that declaration crap
> from Algol-like languages. If you give a means to people to declare their
> variables, then they'll start using it. "But it's optional!" you'll say.
> Well, that won't be heard. People will just blindly follow their C and
> Java knowledge and we will lose the cleanliness of syntax that Python has
> enjoyed for so long.

I don't see the absence of type declarations as having much to do with
Python's cleanliness. As Tim said, declarations improve readability by
serving as documentation.

 Paul Prescod