[Edu-sig] The Urner Approach to CP4E

Kirby Urner pdx4d@teleport.com
Wed, 07 Jun 2000 10:56:08 -0700


>That said, I will come to the defense of Kirby Urner's approach to
>connecting math and programming.  If I understand Kirby's approach
>correctly, the objective is not to teach programming, it is to use
>programming as a tool for manipulating abstract concepts.  

That's certainly part of it.

You could say ordinary math notation is likewise a tool for
manipulating abstract concepts.  A typical high school math
book will introduce SIGMA (capital Greek letter -- the zig-zag
one).  SIGMA operates on an indexed list of terms, summing 
them together.  

For some reason, PI (the capital letter, not lowercase pi, 
which is 3.14159...) is rarely introduced at the same time, 
even though it's a completely parallel operator that works 
by multiplying instead of summing.  I'd probably introduce 
both operators at the same time, and then go straight to the 
Python versions:

  >>> from operator import mul, add

  >>> def sum(terms):  return reduce(add, terms)

  >>> def product(terms):  return reduce(mul, terms)

  Usage:
  >>> myterms = [1,2,3,4,5]
  >>> sum(myterms)
  15
  >>> product(myterms)
  120
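
For anyone meeting reduce for the first time: it folds the
operator in from the left, i.e. reduce(add, [1,2,3,4,5])
computes ((((1+2)+3)+4)+5).  Spelled out as explicit loops
(sum2 and product2 are names of my own, just for illustration):

  >>> def sum2(terms):
  ...     total = 0                 # additive identity
  ...     for term in terms:
  ...         total = total + term  # running sum, a la SIGMA
  ...     return total

  >>> def product2(terms):
  ...     total = 1                 # multiplicative identity
  ...     for term in terms:
  ...         total = total * term  # running product, a la PI
  ...     return total

  >>> sum2(myterms)
  15
  >>> product2(myterms)
  120

Same answers, more keystrokes -- which is what reduce saves us.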

I don't find the Python way of expressing these operations
intrinsically more or less cryptic than using capital 
Greek letters.

We should remind ourselves of the history of curriculum 
writing.  In the old days, the early 1900s and before, it
was the norm to learn some ancient Greek and Latin.  Kids
were being drilled in the Greek alphabet _anyway_ (i.e. 
outside the math classroom context).  But this is no longer 
the case, and the Greek alphabet is not being reinforced 
very effectively in the "language lab" context.

On the other hand, batching commands into executable scripts
and feeding these to a microprocessor is _very much_ part
of the cultural ambience.  It's going on everywhere now, but 
was not at the turn of the century.  So I would argue that
phasing in something like "Python notation" in parallel 
with the more traditional forms is a way to restore 
relevance, to integrate content -- and to fill a void 
left by the demise of ancient Latin and Greek as second
languages (learned to at least a rudimentary degree of 
fluency).
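
To make "batching commands into scripts" concrete, here's
the same material as a wee script instead of an interactive
session (terms.py is a hypothetical name -- just a sketch):

  # terms.py -- SIGMA and PI as a batch of commands
  from operator import mul, add

  def sum(terms):      return reduce(add, terms)
  def product(terms):  return reduce(mul, terms)

  if __name__ == '__main__':
      myterms = [1,2,3,4,5]
      print "SIGMA of", myterms, "=", sum(myterms)      # 15
      print "PI of", myterms, "=", product(myterms)     # 120

Feed that to the interpreter (python terms.py) and out come
the answers -- no Greek required.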

In sum, I think there's a prejudice, with a long history, 
which goes something like this: "math books full of geeky 
Greek symbols are 'more abstract' and hence 'closer to the
real ideas' than are books full of computer algorithms
written in ASCII".  

While I would agree that the Greek-enabled notations 
are often more compressed, and sometimes more cryptic as 
well (I use the adjective "crypto-compressed"), I don't 
think computer languages are really all that "second class" 
(in the sense that, once you understand math, you'll throw 
away these training wheels and just use the "straight 
stuff", i.e. the Greek letter operators).

What I'm saying is that from here on out, we're going to
be looking at pre-computer notations and computer languages
as parallel/convergent.  With the advent of Unicode, and
the ability to actually deploy Greek letters as operators,
e.g. def SIGMA (<-- Greek character goes here), the 
distinction between "standard math notation" and "computer
code" is going to further erode (MathCad is a good example
of this -- it already lets users program using a lot of the
standard notations).
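
A speculative sketch of what that might look like, assuming
an interpreter that accepts Greek letters in names (today's
Python doesn't) and still has reduce as a builtin:

  >>> from operator import mul, add

  >>> def Σ(terms):  return reduce(add, terms)

  >>> def Π(terms):  return reduce(mul, terms)

  >>> Σ([1,2,3,4,5])
  15
  >>> Π([1,2,3,4,5])
  120

At that point, the "Python version" and the "textbook 
version" are nearly the same string of symbols.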

Kirby