Why python???

Alex Martelli aleax at aleax.it
Sat Sep 6 11:17:24 EDT 2003


Michael Peuser wrote:
   ...
>> > (1) More power for same price - this works in favour of 'complex
>> > software'
>> > (2) Same power for less money - this works in favor of 'cheap software'
>>
>> And software is developed more cheaply in higher level languages than
>> in lower level ones.  Therefore, both [1] and [2] work in favour of
>> HLL's, which let you develop more complex software for the same price or
>> the same software (software of the same functional complexity, but more
often than not somewhat hungrier for hardware resources) faster & cheaper.
> 
> No one will ever dispute this. 

Oh good, because your assertions DID "dispute this" -- you very
specifically claimed that there was no such thing helping Python
as Moore's Law helped Linux!!!

> The question is however: How much HLL is enough? 

That may be ONE question, and I'll gladly try to answer it if and
when a language with a substantially higher level than Python is
an alternative for a project about which I get input.  In general it's
not trivial to stay BOTH very-high-level AND general purpose enough
for a really wide range of tasks.  A language we're developing at
AB Strakt as part of the CAPS framework, for example, looks like...:

   attribute salary is Currency
       "Salario mensile"
       restriction quantity(1)
       
       on create code Python
           if value[0] < minimumSalary:
               value = [ minimumSalary ]
           return value
       %%

etc -- now THAT is a higher-level language than Python, and it's
a dream to use for directly transcribing into executable code the
results of ERD analysis for typical business applications to be
deployed as multi-layer clients/middleware/server distributed apps;
but I wouldn't even DREAM of using it to program a completely
different kind of application, such as a videogame, a compiler,
a web-spider engine, and the like, all application areas for which
Python, which is MUCH more general-purpose, is so perfectly well
suited.  Note that this language uses Python as its lower-level
embedded language; indeed the whole VVHL (very-very high-level)
program gets compiled into Python (by a Python-coded compiler, of
course) and executed as Python (by a Python-coded middleware, of
course -- resting on top of a very special Python-coded OODB and
feeding clients such as GUI ones, Web apps, and batch scripts --
all typically coded, of course, in Python).
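
Just to make the "compiled into Python" bit concrete, here's a purely
hypothetical sketch of what the on-create rule above might very roughly
turn into once rendered as plain Python; the Currency stand-in, the
SalaryAttribute class and the hook names are all invented for
illustration, not the actual CAPS output:

    class Currency(float):
        """Toy stand-in for a real currency type (invented for this sketch)."""

    minimumSalary = Currency(1000)

    class SalaryAttribute:
        """Rough rendering of: attribute salary is Currency,
        restriction quantity(1), plus the on-create business rule."""

        def on_create(self, value):
            # business rule: ensure salary >= minimumSalary by raising it
            if value[0] < minimumSalary:
                value = [minimumSalary]
            return value

        def check_restrictions(self, value):
            # restriction quantity(1): exactly one value, or the commit
            # attempt is meant to fail rather than be silently fixed
            if len(value) != 1:
                raise ValueError("salary must have exactly one value")

    attr = SalaryAttribute()
    value = attr.on_create([Currency(800)])
    attr.check_restrictions(value)
    print(value)    # prints [1000.0]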

No "height" is "enough", PER SE, in a sufficiently restricted
domain -- and "multiuser business apps centered on cooperation
of knowledge workers" turns out to be sufficiently restricted
to let us at AB Strakt design a special language for it, one
which catches the essentially "declarative" aspects of the field
and embeds Python for non-declarative business rules such as the
above "ensure salary is >= minimumSalary by raising it if needed".

Of course, if and when we find out that this kind of on-create
action (business rule) is frequent enough to warrant it, we may
want to push the language level of our own declarative language
up a bit further (perhaps by some kind of macro mechanism) into
    ensure minimum(minimumSalary)
(just guessing at the hypothetical future syntax here) in much
the same way as the existing "restriction" statement expresses
constraints that are meant to make a commit attempt fail rather
than fixing it somehow.
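
Purely as a guess at how such a macro might expand (the ensure_minimum
name and the expansion scheme are mine, nothing that actually exists in
the framework), the compiler could simply rewrite the declarative
statement into the same kind of on-create hook shown earlier:

    def ensure_minimum(minimum):
        """Return an on-create hook enforcing value >= minimum (sketch only)."""
        def on_create(value):
            if value and value[0] < minimum:
                value = [minimum]
            return value
        return on_create

    # the hypothetical declarative statement
    #     ensure minimum(minimumSalary)
    # would then expand, roughly, into
    #     on_create = ensure_minimum(minimumSalary)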

But below our VVHL language we'll always need a totally general
purpose one; our language is not meant to compile itself and
never will (while Python will, see the pypy project).  I am not
sure how much higher than Python you can go *while staying fully
general-purpose* -- or if it would be wise to.  We'll see when
and if somebody invents such a language with a FP level of, say,
roughly one or two lines per FP (Python is at about 4-5, I would
guess).  _Simplicity_ is also important, as such relative
failures as PL/I and Algol68 show; Python's magic is that it
gets to be so powerful AND simple at the same time -- whether
that feat is repeatable at much higher semantics levels, I dunno.
(NB in this context FP=Function Point -- nothing to do with
Functional Programming, as it happens).
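
Just to spell out the arithmetic behind those guessed figures (the
500-FP application size is an arbitrary example, and both lines-per-FP
numbers are the rough guesses above, nothing measured):

    function_points = 500                      # arbitrary example size
    for language, lines_per_fp in [("Python (guess)", 4.5),
                                   ("hypothetical VVHL (guess)", 1.5)]:
        loc = function_points * lines_per_fp
        print("%s: about %d lines of code" % (language, loc))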


> Why are purely functional languages practically unknown?

What's that gotta do with the issue?  Neither ML nor Haskell
are substantially different from Python in terms of FP language
level.  My personal guess is that using functional programming
effectively requires a certain mathematical mindset that most
people just don't have.  Using Python doesn't -- "it fits your
brain", as amply demonstrated over and over already.


> [Some absolutely convincing remarks and well structured description of the
> current situation on the server market by Alex]
> 
> My original point in answer to Facundu Bastidas' remark was: IF the
> hardware costs had not dropped down to the same order as (base!) software
> costs, then Linux would not have had any chance! This is still what I
> think with my zero knowledge.

It's weird to argue a counterfactual hypothetical, but I don't see why
(e.g.) IBM couldn't have cottoned on to Linux as a cheaper and more
profitable alternative for their system integration business than
developing and maintaining their own sub-brand of Unix, whatever may
have happened to hardware costs.


> My other point was that nothing like that is going on in the
> software development business. The price of tools is of no significance.

You are wrong.  Maybe on some level it "shouldn't" be, but bean
counters are going really wild these days -- and new frugal startup
companies (very different mindset than those of just a few years ago)
ARE looking hard at EVERY penny they spend as a matter of routine.
Each penny has to justify itself directly -- a rule that served such
now-huge companies as Walmart or Ikea quite well over the years,
after all.

Furthermore, using open source tools isn't just a COST advantage: it
gives obvious strategic benefits -- it lets you have a backup strategy
that you might lack if the sole supplier of your tools went belly-up,
or got bought out by some other company for its customer base (but
with the strategy of slowly killing the tools that are making you its
customer), or simply decided to kill your tools in favour of some
costlier/more profitable others (suppliers DO that, you know, if they
get a good enough stranglehold on their customers).

Some companies have suffered enough such episodes in the past that
they DO have a strategic bias towards open source by now.


> It is the (expected) increase of productivity. The DoD Ada is the best
> example I can think of.
> 
> We have not yet reached the situation where the vendor includes all the
> hardware you need when you buy a program.

Actually there are such cases (although "buy" is typically incorrect --
such sw/hw combinations are typically *leased*), but I do think they're
quite a minor factor in the IT industry today, as yet.


>> >> say COBOL, even if you needed to supply training for it (it's not
>> >> hard to teach COBOL to somebody who knows assembly, anyway -- the
>> >> reverse is harder).
>> > I am with you; maybe they would more probably have been trained for some
>> > "macro language". This is no 'paradigm change'. Most C++ programmers (I
>> > know what I am talking about!) write C programs.
> 
>> History proves you are wrong: in practice, COBOL was what took over
>> the market for business application development, *NOT* macro assembly
>> as you claim would "more probably" happen.  A paradigm change, to be
>> sure, but it's what DID win historically.  By ignoring history, as
>> you're doing, you may not be doomed to repeat it, but you sure do
>> handicap grievously your chances to predict the future.
> 
> Please! I do not ignore history! I hypothetically predicted something that
> I know did not happen, but that would have been more plausible. It did not

So, you should understand that your criteria of prediction are wrong.

> happen because the number of programmers was small at that time. In fact

Clearly irrelevant.  If programmer productivity was very low, then more
programmers were needed to produce any given total program functionality
than would be with higher productivity (or, with the same number of
programmers much more functionality could be produced).  Thus it made
more sense to train programmers in a higher-productivity technology
than to NOT do so -- and the returns of this strategy were (assuming
enough demand for program functionality, which has always been the
case) PER PROGRAMMER, just as the expenses, thus the total number of
programmers is not relevant in the decision-making process.
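
With made-up numbers (the training cost, programmer-year value and
productivity gain below are pure assumptions, chosen only to make the
per-programmer point concrete), the gain-to-cost ratio comes out
identical whether you have ten programmers or a thousand:

    training_cost = 5000           # assumed one-off cost per programmer
    yearly_value = 100000          # assumed value of one programmer-year of output
    productivity_gain = 0.20       # assumed 20% boost from the new technology

    for headcount in (10, 1000):
        gain = headcount * yearly_value * productivity_gain
        cost = headcount * training_cost
        print("%d programmers: gain %d vs cost %d, ratio %.1f"
              % (headcount, gain, cost, gain / cost))
    # ratio is 4.0 in both cases -- headcount cancels out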

> it was not COBOL that took over but RPG, to correct your historical background.

RPG played quite a role, but much more software was in fact developed
in COBOL than in RPG.


> In most companies you do not train programmers but (try to) hire
> experienced staff.

Most companies in Europe do BOTH things.  Laying people off is just
too costly.


>> for many examples.  As the evidence accumulates, Python also starts
>> getting in non-sneakily, by executive decision rather than by the above
>> means -- and keeps proving itself.
> 
> We shall wish that.

It's a fact, not a wish.  Progress may not be as swift as we might
WISH, but it's undeniably happening.


>> And Moore's Law is absolutely crucial to this relentless trend.
> 
> I see no reason why C# and .NET should not be the winners!?

If dotNET should emerge as the favoured platform, Python will then
occupy that niche just as it did for the JVM, and it will offer
just the same productivity advantages wrt C# as it does wrt Java
or C++ and for identical reasons: higher level, and simplicity.

> ...
>> > I should also like to discuss your aspect of programmer's
>> > productivity.
>> > This is an important factor. However, all investigations show that
>> > programmer's productivity is an unmeasurable quantity.
>>
>> Once again, you are totally wrong, and moreover, you are wrong in an
>> assertion that you choose to express in the broadest, unqualified terms.
> 
> I was very well aware of what I was saying ;-)
> 
>> Lutz Prechelt's study, the best-known example of empirical investigation
>> of programmer productivity, shows Python and Perl ruling the roost, and
>> grinding e.g. Java into the dust.
> By some coincidence I came across that study some months ago (googling for
> some Python matter). I was impressed by the thoroughness of the study but
> not by the result. The study concentrated on coding and some

Of course.  Including maintenance (bugs discovered indicating the
need for corrective maintenance).

> string/pattern matching related task

Of course, any empirical study will concentrate on one task.  I would
not describe that "phone book" example as "string/pattern matching",
exclusively, of course.

> There will be different results when doing embedded applications

Quite possibly, but as I've already explained that's maybe 10% of
software development, so, who cares?  Embedded applications with
hard real time constraints would see an interesting fight between,
say, C and (some constrained subset of) C++ -- all the major
languages used in application programming, such as Java, Python
and C#, offer no hard real-time guarantees.  And so?
Who cares?

>> This only confirms in a specific case
>> study the well-known results of the Function Point community: the higher
>> level a language, the less code per FP it needs, the higher
>> the productivity in all phases of coding and maintenance.
> 
> Coding is agreed, maintenance is still questionable.

Corrective maintenance costs are well predicted by density of bugs
(which Prechelt did measure) and addition of functionality is strongly
correlated to the cost of the first development (when using decent
methodologies, of course -- "waterfall" ones are just untenable, but
then nobody proposes THOSE as being a better fit for Python!-).

 
>> > Do it with Python then and show it to the world  ;-)
> 
>> It's been done more than once, and it's being done again as we speak,
>> many times over -- again, see the Python Success Stories.  Google is
>> an example that didn't choose to submit all details of its Python
>> success -- but Norvig, Google's search-quality director, has stated
>> unambiguously what a crucial role Python has played and still plays
>> in Google's amazing ongoing success.  The Open-Source Applications
>> Foundation is doing the same with Chandler, a personal information
>> manager application.
> 
> It is not clear whether this is just a niche market like e.g. MATLAB.

Sure -- Google is one niche, Industrial Light and Magic another,
completely unrelated niche, the OSAF a third one, games yet another,
and so on, and so forth.

Breaking up the range of application of a general purpose programming
language such as Python into a zillion niches is just silly, but
feel free to go right ahead -- it's no use, but if it amuses you,
sure, why not.  MATLAB may be one niche of C's application (I do
believe MATLAB is coded in C, but I don't know for sure), but that
says essentially nothing regarding C's (obviously far wider)
range of applicability.


Alex




