[Tutor] Volunteer teacher

Alan Gauld alan.gauld at yahoo.co.uk
Tue Jul 26 17:19:37 EDT 2022


On 26/07/2022 18:19, avi.e.gross at gmail.com wrote:
> The way I recall it, Alan, is that many language designers and users were
> once looking for some kind of guarantees that a program would run without
> crashing AND that every possible path would lead to a valid result. One such
> attempt was to impose very strict typing. 

Yes, and many still follow that mantra. C++ is perhaps the most
obsessive of the modern manifestations.

> Programming in these languages rapidly became tedious and lots of people
> worked around them to get things done. 

Yep, I first learned to program in Pascal(*) which was super strict.

####### Pseudo Pascal - too lazy to look up the exact syntax!  ###

type
   SingleDigit = 0..9;

function f(d: SingleDigit): Boolean;
....

var
   x: Integer;
   y: SingleDigit;
   b: Boolean;

begin
   x := 3;
   y := 3;

   b := f(x);   (* Fails because x is not a SingleDigit *)
   b := f(y);   (* Succeeds because y is... even though both are 3 *)
end.

The problem with these levels of strictness is that people are
forced to convert types for trivial reasons like the one above. And
every type conversion is a potential bug. In my experience of
maintaining C code, type conversions (especially "casting") are
among the top 5 causes of production code bugs.

(*)Actually I did a single-term class in programming in BASIC in the
early 70s at high school, but the technology meant we didn't go beyond
sequences, loops and selection. In the mid 80s at university I did a
full two years of Pascal (while simultaneously studying Smalltalk
and C++, having discovered OOP in the famous BYTE magazine article).

> the other direction and did things like flip a character string into a
> numeric form if it was being used in a context where that made sense.

JavaScript, Tcl, et al...

> My point, perhaps badly made, was that one reason some OOP ideas made it
> into languages INCLUDED attempts to do useful things when the original
> languages made them hard. Ideas like allowing anything that CLAIMED to
> implement an INTERFACE to be allowed in a list whose type was that
> interface, could let you create some silly interface that did next to
> nothing and add that interface to just about anything and get around
> restrictions. But that also meant their programs were not really very safe
> in one sense.

Absolutely, but that's a result of somebody trying to hitch their
particular programming hobby-horse onto the OOP bandwagon. It has nothing
whatsoever to do with OOP. There were a lot of different ideas
circulating around the 80s/90s and language implementors used
the OOP hype to include their pet notions. So lots of ideas
all got conflated into "OOP" and the core principles got lost
completely in the noise!

> address of a pointer to an integer. Many languages now simply make educated
> guesses when possible so a=1 makes an integer and b=a^2 is also obviously an
> integer 

Java does a little of this and Swift is very good at it.

> including some things like JAVA, is their attempt to create sort of generic
> functions. 

But generics are another topic again...

> ...There is an elusive syntax that declares abstract types that are
> instantiated as needed and if you use the function many times using the
> object types allowed, it compiles multiple actual functions with one for
> each combo. So if a function takes 4 arguments that can all be 5 kinds, it
> may end up compiling as many as 625 functions internally.

True, that's also what happens in C++. But it is only an issue at
the assembler level, or if you care about the size of the executable,
which is rare these days. At the source code level the definitions are
fairly compact and clear and still enable the compiler to do strict
typing.

> trivially restrict  what can be used and may have to work inside the
> function to verify you are only getting the types you want to handle.

Although, if you stick to using the interfaces, then you should be able
to trust the objects to "do the right thing". But there is a measure of
responsibility on the programmer not to wilfully do stupid things!

> I am getting to understand your viewpoint in focusing on ideas not so much
> implementations and agree. The method of transmitting a message can vary as
> long as you have objects communicating and influencing each other. Arguably
> sending interrupts or generating events and many other such things are all
> possible implementations. 

Absolutely, and in real-time OOP systems it's common to wrap the OS
interrupt handling in some kind of dispatcher object which collects
the interrupt, determines the correct receiver, and then sends the
interrupt details as a message to that object. And from a purely
theoretical systems-engineering viewpoint, where an OOP system is
a network of communicating sequential machines, interrupts are about
the closest thing to a pure OOP architecture.

-- 
Alan G
Author of the Learn to Program web site
http://www.alan-g.me.uk/
http://www.amazon.com/author/alan_gauld
Follow my photo-blog on Flickr at:
http://www.flickr.com/photos/alangauldphotos
