[Types-sig] updated proposal (fwd)

scott scott@chronis.pobox.com
Thu, 6 Jan 2000 13:28:48 -0500


On Thu, Jan 06, 2000 at 11:46:26PM +1100, skaller wrote:
> scott wrote:
>  
> > If I declare somevar as having type 'Any', it means that it has the Any
> > type (which in itself doesn't enforce anything other than that the
> > variable cannot be used in a way that asserts a particular type).  When
> > something is not declared, its type can be deduced by a relatively
> > predictable algorithm, but that deduced type is one which allows the
> > more permissive 'one of a or b or ... or n'.  Declared types seem more
> > useful as 'always any of a or b or ... or n' to me.  If we wanted the
> > 'one of ...' behavior in type declarations, I guess I'd be more inclined
> > to call that 'xor' than 'or'.  Given a choice between the two, I'd
> > prefer the 'always any of ...' semantics.
> 
> 	The question is not a matter of your personal choice
> but of logic and mathematics.
> 
> 	Consider a function:
> 
> 	f : Int or String -> Int or String
> 
> 
> The function is correct only if it accepts at least the
> declared inputs, and possibly MORE.
> 
> For the return, it is the other way around: the function
> must return LESS than the specified types: for example,
> always returning an Int would be OK, but returning a
> Float would not be.
> 
> So, when type checking the FUNCTION, the interpretation
> of the static type declarator for the argument and
> the return value is different. [In fact, dual: the comparison
> is reversed.]
> 
> On the other hand, when checking a _call_, as opposed to
> a function definition, what can be passed in is LESS than
> the specification, but the client can apply a procedure to
> the return value that works for MORE types than the
> specification.
> 
> My point: the 'behaviour' of a static interface declaration
> depends on context. There is no single interpretation.
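
To make the variance point concrete for myself, here's a minimal
sketch in plain Python (the declared signature lives only in a
comment, since nothing below depends on any particular proposal
syntax, and the function names are made up):

    # declared: f : Int or String -> Int or String

    def f_ok(x):
        # accepts MORE than declared (it also handles Floats); no
        # caller who respects the declaration can tell the difference
        if isinstance(x, (int, str, float)):
            return 0          # always returns an Int: LESS than declared, OK
        raise TypeError(x)

    def f_bad(x):
        return 1.5            # returns a Float: MORE than declared, wrong

    # checking a _call_ is the dual of checking the definition:
    result = f_ok(42)         # passing only an Int: LESS than declared, OK
    print(repr(result))       # repr() works on MORE types than declared, OK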

I see this now, and in a subsequent post I presented what I was
trying to get at much more along these lines.  Yes, it depends on
context, as you state.  There's an important piece of context at work
in greg's web page (and in the last version of the code I looked at)
which causes that type system to defer a whole class of checks until
run time: code blocks in which a variable is narrowed to LESS (a more
specific type) by means of a type-assertion.  While this is great for
variables that have no explicit type declaration associated with
them, applying it to explicitly declared types weakens what the
system can guarantee by the end of compile time.
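
A sketch of what I mean (get_value and use_int are made-up names,
and the isinstance() test just stands in for whichever type-assertion
spelling the proposal ends up with):

    def use_int(n):
        # a client that only handles Ints
        return n + 1

    def get_value():
        # the checker only knows the declared union 'Int or String'
        return "forty-two"

    x = get_value()           # suppose x is explicitly declared Int or String

    if isinstance(x, int):    # a type-assertion narrowing x to LESS (just Int)
        use_int(x)            # fine: in this block x is known to be an Int
    else:
        # if the assertion is allowed to stand in for checking the
        # declared union, whatever happens in this branch is only
        # discovered at run time, which is exactly the weakening I mean
        print("got a string: " + x)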

scott