The type/object distinction and possible synthesis of OOP and imperative programming languages

rusi rustompmody at gmail.com
Thu Apr 18 23:35:17 EDT 2013


On Apr 19, 3:53 am, Mark Janssen <dreamingforw... at gmail.com> wrote:
> On Mon, Apr 15, 2013 at 2:53 AM, Moez AbdelGawad <moeza... at outlook.com> wrote:
> >> I'm not quite sure I understand your question, but I'll give it a shot.
> >> :-)
>
> > I'm in this same camp too :)
>
> I am very thankful for the references given by everyone.
> Unfortunately my library does not have the titles and it will be some
> time before I can acquire them.  I hope it not too intrusive to offer
> a few points that I've garnered from this conversation until I can
> study the history further.


You may want to see this: http://www.infoq.com/presentations/Functional-Thinking


>
> The main thing that I notice is that there is a heavy "bias" in
> academia towards mathematical models.

Yeah, wonderful observation. Let's clean up!

If I have a loop:

 while i < len(a) and a[i] != x:
   i += 1

I need to understand that at the end of the loop:
i >= len(a) or a[i] == x
and not
i >= len(a) and a[i] == x
nor
i == len(a) or a[i] == x  # What if I forgot to initialize i?

Now why bother to teach students such a silly thing (with such a
silly name) as De Morgan's laws?
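The point can be checked mechanically. A minimal sketch (the list `a`, the target `x`, and the `find` wrapper are just illustrations of the loop above):

```python
def find(a, x):
    """Linear search; returns the index of x, or len(a) if absent."""
    i = 0  # forget this line and you get the third (buggy) postcondition
    while i < len(a) and a[i] != x:
        i += 1
    # The loop guard was (i < len(a) and a[i] != x); by De Morgan its
    # negation -- the postcondition -- is (i >= len(a) or a[i] == x).
    assert i >= len(a) or a[i] == x
    return i

find([3, 1, 4], 4)   # found: loop exits because a[i] == x
find([3, 1, 4], 9)   # not found: loop exits because i >= len(a)
```

Note that the *and* version of the postcondition would wrongly claim both exit reasons hold at once, which the second call disproves.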

So all hail to your project of cleaning the useless math out of CS.
And to whet your appetite for the grandeur and glory of your visions,
why not start by making music schools enroll tone-deaf students?
After all, wasn't Beethoven deaf?

>  I understand that Turing
> Machines, for example, were originally abstract computational concepts
> before there was an implementation in hardware, so I have some
> sympathies with that view, yet, should not the "Science" of "Computer
> Science" concern itself with how to map these abstract computational
> concepts into actual computational hardware?  Otherwise, why not keep
> the field within mathematics and philosophy (where Logic traditionally
> has been)?   I find it remarkable, for example, that the simple
> continued application of And/Or/Not gates can perform all the
> computation that C.S. concerns itself with and these form the basis
> for computer science in my mind, along with Boolean logic.  (The
> implementation of digital logic into physical hardware is where C.S.
> stops and Engineering begins, I would argue.)

You need to study some history (or is that irrelevant, like math?).
The Turing who invented the Turing machine in 1936 led the code-
cracking efforts of the Allies a couple of years later.
Do you allow for the fact that he may have had abilities, aka 'math'
and 'theory', that were common to both?
Or do you believe that winning wars is a theoretical and irrelevant
exercise?

>
> But still, it seems that there are two ends, two poles, to the whole
> computer science enterprise that haven't been sufficiently *separated*
> so that they can be appreciated:  logic gates vs. logical "calculus"
> and symbols.   There is very little crossover as I can see.  Perhaps
> the problem is the common use of the Greek root "logikos"; in the
> former, it pertains to binary arithmetic, where in the latter, it
> retains its original Greek pertaining to *speech* and symbols,
> "logos").


Yes, there is some truth in what you say.  Just call it logic as
object-language (what you call logic gates) and logic as
meta-language, i.e. logic for reasoning.
[the above is not sarcastic]



> Further, one can notice that in the former, the progression
> has been towards more sophisticated Data Structures (hence the
> evolution towards Object-Orientation), where in the latter (I'm
> guessing, since it's not my area of expertise) the progression has
> been towards function sophistication (where recursion seems to be
> paramount).

Also good to study the views of one of the doyens of OOP:
http://en.wikipedia.org/wiki/Alexander_Stepanov#Criticism_of_OOP


