[Edu-sig] Comments on Kay's Reinvention of Programming proposal

Paul D. Fernhout pdfernhout at kurtz-fernhout.com
Tue Mar 20 00:57:38 CET 2007


Alan-

Thanks for the thoughtful reply, including the interesting historical
information on earlier Smalltalk VM sizes and also on performance
losses with newer hardware and OpenGL's origins. Thanks then too for
OpenGL! :-)

I wrote the following long essay, but feel free to ignore it as I'm sure
you have lots of coding still to do, and I am eager to see the results.
:-) Essentially though, I explore whether one can really claim to be
reinventing programming without tackling head-on the issues of
managing complexity and supporting legacy code.

==== On managing complexity and supporting legacy code:

I especially liked your point of: "we want to use "Math Wins!" as the
motto of this project". Certainly Moore's law is making it more possible
for plain general math to win big, without the clutter of a lot of
hand-coded implicit and explicit optimizations (which can be added later
as you propose). And certainly Moore's law is only working to benefit
existing dynamic systems like Smalltalk and Python as any late-binding
performance overhead is becoming less and less important for more and
more tasks.

Also on math, and on rereading your proposal: it is common, if not
essential, for a mathematician to redefine the terms of a problem until
it matches something they know how to solve. Sometimes the result is
useful -- sometimes it is not. In the realm of science, it is almost
always interesting and entertaining, though. From:
  http://www.inf.uos.de/elmar/Fun/light2.html
"Q:  How many mathematicians does it take to screw in a light bulb?
 A:  One.  He gives it to six Californians, thereby reducing the problem
to an earlier joke..."

I'll agree that "... the proposal is clear as to its aims, and it is
sketchy as to how the aims will be realized ...".

That of course is the nature of much research. If it were claiming to
be *very* basic research (which the USA has fallen down on), I'm sure
it would be sketchy as to its aims too, :-) but then it would be unlikely
to get funded these days. And that is why one famous academic I know
said he built his basic research career on always submitting grant
proposals for the work he was already doing and was near completing, and
then using the grant money to do something new and unexpected, which
after success became the basis of the next retrospective grant proposal.
:-) But he also built his career back in the 1950s-1970s, when funding
was more plentiful relative to PhDs.
   http://www.its.caltech.edu/~dg/crunch_art.html

I'll also agree with your sentiment that: "The rest of your discussion
is orthogonal to the aims of the proposal."  But, as you also say:
"Basically, we think we know how to do the bottom and the top, but the
most interesting stuff in the center (how "programs" are expressed,
etc.) is currently described in terms of analogies to good ideas rather
than good ideas themselves."

It's the "stuff in the middle" that several of my points are about. From
my experience with other projects, including Squeak, I can agree
wholeheartedly with this fragment of the larger Dijkstra quote Ian
Piumarta has in his talk slides of: "... we all know that unmastered
complexity is at the root of the misery..."

Of the techniques you outline, which are really directed at managing
complexity? Perhaps some will help sometimes (like programming using
particles and fields for page layout) but they may introduce their own
complexities.

And while ignoring the outside world of legacy code leads to a simple
set of issues for the design, it also leads the community to face
complexity issues of its own when maintaining and writing code in all
sorts of areas.

Where is, say, the broad emphasis on wrestling code complexity to the
ground? Or where is, say, a broad emphasis on being able to easily
interface with, reverse engineer, and maintain any legacy system in any
language on any hardware? Fanciful perhaps to think so big, but is that
not what "reinventing programming" would be about? Especially one that
added the word "practical" in the subtitle? Or is the proposal really
more about "reinventing all computers"? :-) That is a great goal, in
keeping with your comment on performance lost to poor design by
hardware and software vendors, but it is still quite a bit different.

I know in practice one needs to draw lines somewhere, and one may have
limited resources, so I am not faulting you on having to draw lines. In
fact, it's remarkable you drew them as far out as you have! Nonetheless,
I'm just playing with the boundaries here -- knowing they are unlikely
to change, but doing so both to understand the nature of the problem
space better and to understand more precisely what this project's
contribution will be, given the provocative title. See:
  http://www.worldtrans.org/pos/infinitegames.html
"There are at least two kinds of games: finite and infinite. A finite
game is a game that has fixed rules and boundaries, that is played for
the purpose of winning and thereby ending the game. An infinite game has
no fixed rules or boundaries. In an infinite game you play with the
boundaries and the purpose is to continue the game."

So, for a proposal focusing on "Reinventing programming", I feel it
perhaps just aims too low (at least in these two areas). Which of course
is ironic since I am sure the biggest problem getting work like this
funded is reviewers thinking you are aiming too high. :-) Granted, you
do have "Steps toward" in the title, so perhaps I just hope for too much
from you all at once. :-) What you really need then is more time and
more money. :-)

Also, these two issues of managing complexity in the middle and
interfacing with a legacy code environment may be "orthogonal" to what
you propose, but they are surely not orthogonal to "reinventing
programming". And I would suggest to you that it is in those very two
areas where Python and Jython with their modularity and libraries and
specific community culture are succeeding, whereas the early Squeak
effort pretty much failed in those areas. (Though since Squeak is a work
in progress, it hasn't really failed -- it's just getting there slowly,
and through a lot of new and painful hard work by the community.) Squeak
has other successes in areas where Python has failed (like having
"edit and continue" coding in the debugger, and supporting true message
passing), so this isn't meant to promote one over the other, but I would
hope some new project could build on the strength of *both* systems and
their communities.

One might say that using an idea like message passing everywhere will
simplify things and make management of complexity easier, but as Manuel
de Landa suggests, if this new project does succeed at creating a
homogeneous message passing layer among all software (like if the Linux
Kernel adopts your approach some day :-), complexity at another level
may grow even faster and become even more unmanageable, or in his words:
  http://www.t0.or.at/delanda/meshwork.htm
"To make things worse, the solution to this is not simply to begin
adding meshwork components to the mix. Indeed, one must resist the
temptation to make hierarchies into villains and meshworks into heroes,
not only because, as I said, they are constantly turning into one
another, but because in real life we find only mixtures and hybrids, and
the properties of these cannot be established through theory alone but
demand concrete experimentation. Certain standardizations, say, of
electric outlet designs or of data-structures traveling through the
Internet, may actually turn out to promote heterogenization at another
level, in terms of the appliances that may be designed around the
standard outlet, or of the services that a common data-structure may
make possible. On the other hand, the mere presence of increased
heterogeneity is no guarantee that a better state for society has been
achieved. After all, the territory occupied by former Yugoslavia is more
heterogeneous now than it was ten years ago, but the lack of uniformity
at one level simply hides an increase of homogeneity at the level of the
warring ethnic communities. But even if we managed to promote not only
heterogeneity, but diversity articulated into a meshwork, that still
would not be a perfect solution. After all, meshworks grow by drift and
they may drift to places where we do not want to go. The
goal-directedness of hierarchies is the kind of property that we may
desire to keep at least for certain institutions. Hence, demonizing
centralization and glorifying decentralization as the solution to all
our problems would be wrong. An open and experimental attitude towards
the question of different hybrids and mixtures is what the complexity of
reality itself seems to call for. To paraphrase Deleuze and Guattari,
never believe that a meshwork will suffice to save us. "

And I think for whatever its obvious weaknesses relative to Smalltalk
(like no widespread "edit and continue" :-) the Python community has
excelled at following the path Manuel de Landa references of maintaining
an "... open and experimental attitude towards the question of different
hybrids and mixtures...". And I think  that some good progress can be
made by trying to understand why Python is (relatively) much more
successful in those areas in practice -- as opposed to Squeak which
should be better in theory since the design principals in
Squeak/Smalltalk should define IMHO the better overall architecture
(sorry Guido). On a regular basis, I find myself having to choose
between Squeak and Python, and I invariably choose Python for practical
reasons while still preferring Squeak for conceptual ones (even knowing
the code I write myself would be written much more quickly in Squeak by
a factor of 2X-3X, if for no other reason than keyword syntax and coding
in the debugger). And I am sad about that, since with the exception of
significant indentation, I much prefer Smalltalk syntax and concepts.
So on a practical basis, I am left trying to make Python better, even
as I know, as your proposal says, that message passing is going to be
(in theory) superior for scaling and interoperability to the tight gear
meshing of function calls.
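
(As a hedged aside, here is roughly what I mean by the keyword syntax
difference. The Canvas class below is invented for illustration, not
taken from either system:

  # In Smalltalk the keyword message reads like a sentence:
  #
  #   canvas drawString: 'Hello' at: 10@20 color: Color red.
  #
  # The nearest idiomatic Python uses keyword arguments:

  class Canvas:
      def draw_string(self, text, at=(0, 0), color="black"):
          """Stub that just reports what it would draw."""
          print("drawing %r at %r in %s" % (text, at, color))

  canvas = Canvas()
  canvas.draw_string("Hello", at=(10, 20), color="red")

The Smalltalk form names every argument at the call site by default,
which is a good part of why I find it faster to write and read.)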

Granted, the design you lay out makes such different mixtures of types
of objects and related paradigms quite possible by having a very
flexible implementation system for message passing. As you say, it is
not "Smalltalk" -- but nonetheless I can hope it will build on
Smalltalk's successes -- like continuing to use message passing.

I believe the deep issues will come in trying to manage all that
flexibility. But I just do not see that as an emphasis in the project.
In that sense, I see this project more likely to create new (though
desirable) problems than to solve the difficult ones "professional
programmers" already have when they work on a project of any significant
scope.

One of the beauties of Smalltalk-80 was that it only gave you the
flexibility you really needed (e.g. no arbitrary C-style macros, no
arbitrary processing of messages as in the earlier Smalltalk-72, was it?).
There was not too much clutter at the language level (ignoring pool
dictionaries and the like) -- and the clutter at the library level was
in theory removable or changeable. In that way, this new proposal may
trend toward violating that elegant principle which has allowed modern
Smalltalkers to keep on top of huge projects. For example, I once spent
a week (while all the in-house developers went away for training :-)
changing tens of thousands of methods written according to dopey C++
get/set naming conventions in a huge project (e.g. instead of x and x:,
the guidelines promoted by a previous consulting group had people put
getX and setX: everywhere -- what a mess for Smalltalk!). That
conversion was done mostly using automated tools I wrote, of course,
and it was a conversion made feasible only by Smalltalk-80's elegance
and introspection capabilities. I'm leery of approaches toward
reinventing programming which might lose that elegance of Smalltalk-80.
Colin Putney discusses a related issue here:
http://www.wiresong.ca/air/articles/2006/08/24/scripting-languages-and-ides
He says: "So, it would appear that we can have either a powerful
language, or powerful tools, but not both at the same time. And looking
around, it’s notable that there are no good IDEs for scripting
languages, but none of the languages that have good IDEs lend
themselves to metaprogramming. There is, of course, one exception.
Smalltalk. With
Smalltalk, we have the best of both worlds."
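
(For concreteness, here is a rough Python sketch of the
introspection-driven idea behind that conversion -- not the actual
Smalltalk tools I used, and the Point class and strip_get_set helper
are invented for illustration:

  class Point:
      def __init__(self):
          self._x = 0
      def getX(self):
          return self._x
      def setX(self, value):
          self._x = value

  def strip_get_set(cls):
      """Replace getX/setX pairs on cls with a plain property named x."""
      for name in list(vars(cls)):
          if name.startswith("get") and len(name) > 3:
              attr = name[3].lower() + name[4:]
              getter = getattr(cls, name)
              setter = getattr(cls, "set" + name[3:], None)
              setattr(cls, attr, property(getter, setter))
      return cls

  strip_get_set(Point)
  p = Point()
  p.x = 42     # idiomatic attribute access again
  print(p.x)   # -> 42

Smalltalk's introspection made the analogous rewrite over real source
code practical at the scale of tens of thousands of methods.)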

Python is actually an example of a language with various flexible
dispatching mechanisms built in, but they become a hard thing to
maintain, and the system is becoming ever harder to fully understand
for an end-user programmer. Too much flexibility at one level goes even
farther down the C++ path -- where C++ is designed to let you do
anything efficiently, in theory -- but in practice it ends up making it
so hard to understand what any piece of code might be doing that it is
rare the programmer can focus enough on a task to do it well.
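
(A small, hypothetical example of the built-in flexibility I mean:
Python's __getattr__ hook lets an object answer attribute "messages"
it never declared, which is powerful but invisible at the call site.
The Record class is invented for illustration:

  class Record:
      def __init__(self, **fields):
          self._fields = fields

      def __getattr__(self, name):
          # Called only when normal lookup fails; nothing in the
          # class body advertises which attributes will work.
          try:
              return self._fields[name]
          except KeyError:
              raise AttributeError(name)

  r = Record(title="Reinvention", year=2007)
  print(r.title)   # works, though no 'title' attribute is declared
  print(r.year)    # a reader must find __getattr__ to know why

Handy in isolation; multiply such hooks across a large codebase and
tracing what any given attribute access actually does gets hard.)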

After I originally read the proposal a month or so ago, I thought
about this issue of arbitrary message dispatchers, but ultimately
concluded that for 99% of the time, programming as I do now, I would
not need them and would just have everyone use the same standard
dispatcher. Granted, they may lead to great innovations, which is the
point, and I do hope they do, and may someday lead to people saying,
as they did for structured programming over spaghetti GOTOs, "how did
I get along without it?" But until then, they add even more complexity,
actually moving the system in a negative direction on the core problem
from the Dijkstra quote of managing complexity, which I see as perhaps
*the* most important real yet difficult problem of programming day-in
and day-out in the trenches. (There are lots of other problems in
programming, but they are generally all more tractable... :-)
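
(To make "standard dispatcher" concrete, here is a hypothetical Python
sketch of the flavor of idea I am reacting to -- my guess, not anything
from the proposal itself:

  def send_standard(receiver, selector, *args):
      """The ordinary case: plain method lookup and call."""
      return getattr(receiver, selector)(*args)

  def send_traced(receiver, selector, *args):
      """An exotic substitute: same lookup, but narrating each send."""
      print("send %s to %r with %r" % (selector, receiver, args))
      return getattr(receiver, selector)(*args)

  class Counter:
      def __init__(self):
          self.n = 0
      def increment(self, by):
          self.n += by
          return self.n

  c = Counter()
  send_standard(c, "increment", 5)   # the quiet, everyday path
  send_traced(c, "increment", 2)     # logs the send, then dispatches
  print(c.n)                         # -> 7

My point is just that something like send_standard would cover nearly
all of my day-to-day programming.)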

(I write that last paragraph as someone who spent months trying to
make Python more like Self, but ultimately decided that prototypes
were, in practice, not all they promised to be, and that plain Python
with classes had many advantages. See my writing on this here:
  "PataPata critique: the good, the bad, the ugly"
http://patapata.sourceforge.net/critique.html
Anyway, that's just to show you I too am looking for solutions. )
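
For the curious, here is a minimal sketch (names and details mine, not
PataPata's actual code) of the Self-like prototype style it explored:
objects cloned from objects, delegating to a parent instead of a class:

  class Proto:
      def __init__(self, parent=None, **slots):
          self.parent = parent
          self.slots = dict(slots)

      def get(self, name):
          # Look in this object, then delegate up the parent chain.
          if name in self.slots:
              return self.slots[name]
          if self.parent is not None:
              return self.parent.get(name)
          raise AttributeError(name)

      def clone(self, **overrides):
          return Proto(parent=self, **overrides)

  point = Proto(x=0, y=0)
  moved = point.clone(x=10)               # shares y with its parent
  print(moved.get("x"), moved.get("y"))   # -> 10 0

Elegant on paper; in practice I found the delegation chains made
debugging and tooling harder than plain classes.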

If this new project to reinvent programming hits limits, I think it
likely that these two issues (complexity management and legacy
interfacing) will be the sort it finds itself wrestling with once the easy
stuff is done (like when you want to really make it easy for a user to
print a physical page from all platforms the software could run on). And
I don't have any easy answers for these issues; perhaps no one does --
yet (or if ever). But I think it is in the direction of exploring that
complexity management issue head on that we will see a complete
reinvention of programming. And yes, obviously, functional programming
and aspect-oriented programming and so on may all be parts of that
puzzle, and to the extent the proposal's architecture can easily
encompass all those, it helps with the big picture of managing complexity.

I guess ultimately the goals of making an easy-to-use GUI (all the way
down) for novices can conflict with the goals of making a system that
allows an experienced user to deal with a vast and complex set of
problems. By "conflict" I don't mean technically; I mean more like in
terms of choosing where to spend limited attention. Which highlights the
difficulty sometimes of following your principle of "Easy things should
be easy; difficult things should be possible."

And while not exactly the same issues, it gets back to the link I
previously had on a "Linus versus Gnome" conflict/exchange:
  http://www.desktoplinux.com/news/NS8745257437.html
From there: "As the thread continued, it became clear that the
underlying problem is that Torvalds and the GNOME project have
contradictory design goals. Torvalds wants to increase users' access to
the system to give them the maximum possible power, while GNOME aims to
increase the system's usability by making it as easy to use as possible.
For GNOME, this means limiting access to the system from the GUI."

So, part of this issue comes down to audience. Reinventing programming
for whom? For the novice or the high schooler? For the savvy
researcher? Or for the professional programmer? Or for everyone? What does
"personal" mean? These are more rhetorical questions, of course. Even if
you have a specific user in mind, you have every right to build the
system to meet whatever user group you are interested in supporting,
as well as to keep that vague; I'm just lobbying for supporting the
personal users who have large complex systems to deal with. :-)

As Kirby Urner said so insightfully in his own reply: "So often in
nature the rules are different enough from level to level that you
wouldn't *want* the same coding language and/or development environment,
let alone concepts, to permeate all levels."

I think that is a very important point this project will no doubt need
to deal with over time -- how "levels" are approached. And perhaps the
same thing is also true for different users, or perhaps different
roles for different users, or different levels of experience, each
dealing with different needs resulting from qualitative differences
emerging from quantitative ones (like the number of objects they must
manipulate, the number of things they already know, their motivation
to overcome frustrations, and so on).

Still, I also agree wholeheartedly with Kirby when he adds: "... But
that's not to quibble with wanting to go ahead with *the experiment*  ..."

When you do an experiment with pure reagents, you learn a lot
about general principles. And often, the purer the reagents, the more
certain you are of the general principles learned.

In any case, I think it will be a very interesting experiment indeed,
and well worth doing, even if it perhaps leaves out some things on my
own personal wish list.

I signed up for the Fundamentals of New Computing (fonc) list
   http://vpri.org/mailman/listinfo/fonc
and I'll be curious to see how it progresses. There are a lot of neat
ideas you are working with. I appreciate the invitation in Ian's slides
of: "go home and innovate!
*  built your own and share it with the world
* or use ours: releases every month or two". :-)

The project is surely succeeding already in getting people (myself
included) to think in new ways about reinventing computing. Thanks also
for that. So it is a success already! :-)

All the best.

--Paul Fernhout

Alan Kay wrote:
> Hi Paul --
> 
> Just to clarify a little. The proposal is quite terse via the 15 page
> limit of NSF plus some of their side conditions. It was drawn from a
> much longer "90% finished" internal white paper that I had written to
> try to think things through with the help of some friends. This made its
> way to NSF and last summer they requested that I write a proposal (and
> to my great surprise funded the project late last year).
> 
> Reading it again might help clarity (perhaps after the remarks below).
> 
> This project is a deep exploration to try to learn enough ("Steps toward
> ...") about a number of systems building techniques that have been
> around for a while but have not been put at the center of an entire
> system, so that successful results will furnish much stronger hints
> about how programming could be actually reinvented.
> 
> This project is neither an operating systems project nor a new
> programming language per se, but is primarily architectural. (It has to
> do what OSs do and what programming languages do, but it doesn't have to
> do them like existing systems -- for example, there will not be an
> actual "operating system" in any standard sense).
> 
> The proposal says quite clearly that it is to create the "personal
> computing experience" (PCE) from the end-user down to the metal, and it
> spends the first few pages trying to gist what this means. If you read
> it again, e.g. it doesn't eliminate printing, nor is it primarily about
> OS machinery (but it does require many things to be done including
> TCP/IP, printing, etc.). The PCE as defined in the proposal is a lot of
> stuff that includes UI, many kinds of media, etc., ways to use the
> Internet, and this will be a good and tough target for 20,000 lines of
> code from the end-user down to the metal.
> 
> Naturally we have thought about Moore's Law: it was thinking about this
> that gave rise to the HW and SW personal computing systems we did at
> PARC in the 70s. However, looking at Moore's Law today, we see that it
> has not helped code bloat (and, in fact, has provided "life support" for
> a lot of systems that probably should have been allowed to die). The
> main place we plan to use Moore's Law is an analogy to what we did at
> PARC, namely, to make special machinery that allowed us to (a) do many
> experiments without needing optimization, and (b) to anticipate 10 or
> more years into the future with optimizations. Then we used special
> microcoded computers; today FPGAs are somewhat an equivalent secret weapon.
> 
> An important thing to realize about Moore's Law is that the law as
> stated doesn't really obtain at the systems level. For example, Butler
> Lampson has estimated that today's hardware is about a factor of 1000
> less efficient than the home-brew systems we made at PARC; I've done
> some benchmarks for Smalltalk-80 that indicate at least a factor of 600
> has been lost over the Dorado. This is either 9 or 10 doublings, and
> this is 13-15 years lost because of poor design by vendors (this is
> significant).
> 
> In any case, to say it again, we don't want to use Moore's Law to
> escape, we want to use "Math Wins!" as the motto of this project. We
> want a better approach to architecture at all levels of the system. (I
> helped do some of the original math at Utah that wound its way into
> OpenGL, and the "math of OpenGL" is a good contrast to "the code of
> OpenGL".)
> 
> I've always liked FORTH as far as it went (but it never went far enough)
> and so its architecture doesn't overlap with this project. Also, this
> project has nothing to do with Smalltalk or its various architectural
> choices (similar comment as per FORTH). There is no commitment to
> standard notions of objects, classes, messages, etc. in this project.
> (The scaffolding needed to build an arch does not remain after the
> keystone is placed.)
> 
> It's not primarily an engineering project, so just how some of the
> criteria are fulfilled is left open to the results of research over the
> next few years, but I wouldn't be completely surprised if the primary
> semantics of this system were not message-oriented. Basically, we think
> we know how to do the bottom and the top, but the most interesting stuff
> in the center (how "programs" are expressed, etc.) is currently
> described in terms of analogies to good ideas rather than good ideas
> themselves.
> 
> All I will say about your discussion of tiny kernels, etc., is that the
> entire original Smalltalk ran in 32K on the Alto, and the kernel for
> ST-78 (for the Notetaker) was only 6KB. But these references, including
> mine, have little to do with the aims of this project (again a
> suggestion for rereading the proposal). This project has nothing much to
> do with past systems, but seeks to try alternative architectural ideas
> (some of which have been around for a while, and most of them not
> thought up by us).
> 
> The rest of your discussion is orthogonal to the aims of the proposal. I
> think we might have the case here, as the Talmud says, "We see things
> not as they are, but as we are". I think the proposal is clear as to its
> aims, and it is sketchy as to how the aims will be realized (this is
> partly the result of unclear writing by me, partly the demands of space,
> and partly that it is a real old-time research project that requires
> quite a bit of invention to succeed). One of the reviewers of the
> proposal took us to task for saying that we didn't know how to do
> everything!

