subsetting and .NET (was Re: Python and Ruby: a comparison)

Alex Martelli aleax at aleax.it
Thu Jan 3 05:01:44 EST 2002


"Edward Diener" <eldiener at earthlink.net> wrote in message
news:3C33856F.3050809 at earthlink.net...
    ...
> > Subsetting becomes even more prevalent as soon as you accept some
> > "standard library" features as part of a language.  If you never
    ...
> I make the distinction between libraries and language features. There is
> a totally different mindset to not using libraries than to not using
> features, even though both "not using" parts may be a logical result of

But, not using *PART* of a module or package, or not using a subset of
some object's methods, is very similar, mindset-wise, to not using part
of a language's features.

E.g., consider:

A) "I don't ever write 'x+=23' -- I grew up with 'x = x + 23' and that's
good enough for me"

B) "I don't ever write 'd.setdefault(key, def)' -- I grew up with
'if not d.has_key(key): d[key] = def' and that's good enough for me"

C) "I don't ever write '[x for x in l if isok(x)]' -- I grew up with
'filter(isok, l)' and that's good enough for me"

(and so on, and so forth).  Some of these uses and not-uses affect
stuff that is documented in Python's Library manual, others don't
(affecting only stuff that's documented in the Language manual), but
there's really very little "mindset difference" between them.
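
To make the A/B/C pairs concrete, here's a minimal sketch (the names
d, key, isok and l are placeholders of my own choosing); each pair
shows two spellings of the same operation, so preferring one over the
other is subsetting, whether the construct lives in the language
proper or in the library:

    x = 10
    x += 23                            # augmented assignment...
    x = x + 23                         # ...vs the traditional spelling

    d, key, default = {}, 'spam', 0
    d.setdefault(key, default)         # library method...
    if key not in d:                   # ...vs the explicit test
        d[key] = default

    isok = lambda n: n % 2 == 0
    l = list(range(10))
    kept = [x for x in l if isok(x)]   # list comprehension...
    kept = list(filter(isok, l))       # ...vs the filter builtin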


> inclination to learn to use it, or 3) Yes, I know all about it but I
> have always done things this way because very few implementations
> support this new feature so I haven't bothered to learn it.

A very good pragmatic reason to eschew "this new feature", if true
and if you do need to worry about several implementations.  Among the
applications I maintained, it was 1994 before I could feel confident
of going for ISO-C function prototypes throughout the codebase (and
even then, it was only possible because at that point in time I felt
I could, worst case, rely on gcc being available in all cases of need).
The Python codebase only took the same step in version 2.0, i.e., in
the year 2000, since it had to worry about a wider range of porting
targets.


This doesn't really affect the .NET situation, of course.  E.g.,
the "new features" in Haskell for .NET are the .NET-specific parts
(equivalent to a traditional "foreign function interface" part).

Worrying that "Haskell for .NET" users won't use "Haskell specific"
parts (such as lazy evaluation or typeclasses) is IMHO misplaced: it
would be a terrible chore to carefully avoid them (just try to avoid
lazy evaluation systematically in Haskell...!!!), and people with no
liking or need for them would presumably choose a language of wider
availability such as, say, C#.  All in all, I consider your worries
that .NET-compatible versions of such languages will worsen the issue
of subsetting for the languages in question to be unfounded.
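
For readers who haven't met lazy evaluation: the nearest everyday
Python analogy is a generator, which produces values only on demand.
A rough sketch, offered purely as an analogy:

    import itertools

    def naturals():
        # a conceptually infinite stream: each value is computed only
        # when a consumer actually asks for it -- the (rough) Python
        # analogue of Haskell's pervasive laziness
        n = 0
        while True:
            yield n
            n += 1

    # only the first five values are ever computed:
    print(list(itertools.islice(naturals(), 5)))    # [0, 1, 2, 3, 4]

In Haskell, essentially every expression behaves this way, which is
why systematically avoiding laziness there would be such a chore.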


> The language which you ( and so many others ) identify as C++, with its
> ability to "play havoc with pointers and memory management" has been
> superseded in the past 7 years by a language whose modern constructs
> make it nearly impossible to have these problems. The fact that many

I *beg* your pardon: I'm a Brainbench MVP for C++, "C++ Guru" is one
of the main roles for which my current employer pays my not inconsiderable
salary (side by side with similar guruhood for other technologies such
as COM, network protocols, and Win32 APIs), and I'm perfectly aware of
ISO C++ strengths (and weaknesses -- mostly, subtlety and complication).

It's simply untrue that ISO C++'s "modern constructs" ``make it nearly
impossible to have [memory management] problems'', because those constructs
need to keep working within an overall conceptual framework where
memory management is among an application programmer's responsibilities
and "object ownership" is the fundamental conceptual tool for it.  And
it's just not an appropriate tool for most application needs (garbage
collection IS).
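
Here's a minimal Python sketch of the difference (the Node class is
invented for illustration): under garbage collection no ownership
protocol is needed anywhere, dropping the last reference suffices:

    import weakref

    class Node(object):
        # in C++, some part of the program would have to *own* this
        # object and arrange for its deletion; here, the runtime
        # tracks its lifetime for us
        def __init__(self, payload):
            self.payload = payload

    n = Node('spam')
    probe = weakref.ref(n)    # observe the object without owning it
    print(probe() is n)       # True: still alive
    del n                     # drop the last (strong) reference
    print(probe())            # None: reclaimed automatically

(On CPython the reclamation is immediate, thanks to reference
counting; other implementations may defer it, but the programmer's
obligations are nil either way.)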

> so-called C++ programmers do not want to use these very simple and
> elegant constructs, in favor of more error-prone techniques still
> supported by the language for backward compatibility, is not proof of
> these problems still existing for practiced programmers in the language.

If somebody's doing "greenfield development" in C++, they're lucky (if,
in my well-informed opinion, misguided in their language choice).  Most
commonly, there are legacy subsystems and legacy interfaces to be kept,
and those won't give you the freedom to use currently-optimal approaches
(typically template-based ones) everywhere.

> Java's doing away with those problems is a red herring, since these
> problems no longer exist for professional C++ programmers.

I play the guru and language advisor for almost 200 professional C++
programmers at my current employer, and I maintain that your assertion
is totally false.


> Nor am I sold on productivity enhancement when moving to Java simply
> because you claim it. And I am a Java programmer also. Productivity is

I'm not "a Java programmer".  I just conducted pilot projects back when
Java was launched, to help my employer make strategic technology choices
on the basis of something better than sheer hype (technology trailblazing
being among my job responsibilities).  I measured things carefully and
noticed a productivity increase of between 10% and 20% for such tasks as
coding in Java a number of existing proprietary net protocols, compared
to coding them in ISO C++ -- this was with substantial C++ experience vs
no substantial Java experience yet (but some features of ISO C++ were
then new enough that the experience with them was still a bit thin,
albeit richer than that with Java).  The gain came mostly from automatic
memory management, offset, however, by the huge productivity loss due
to the lack of templates (generics) -- a short while later I tried a
Java dialect called "Pizza", which DID offer generics, but it was deemed
an experimental project, not suitable for production use (pity).

On the basis of these and other measurements, my employer decided that
the costs and risks of technology migration were not warranted; Java was
used only for certain well-defined projects where customers demanded it,
and otherwise we stuck with C++.  I still strongly believe that going to
a higher-level language (such as Python) for most of our development
would be a huge productivity win, but I concur that languages such as
Java or C#, with their maybe-20% gains, are not worth the costs and risks
(particularly for migrations; greenfield projects may be different).
When generics are in, things may get better, but .NET generics are at
this time *not* scheduled to cover one of the most productive patterns,
Coplien's "Pattern Without a Name" (class C<T>: public T {...}); I'm
not sure whether "Generic Java" accepts Coplien's PWAN.
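
For the curious, PWAN is trivial to spell in Python, where a base
class is just a runtime object; a sketch, with names invented for
illustration:

    def with_logging(Base):
        # the Python analogue of C++'s
        # "template<class T> class C: public T":
        # a class parameterized by its own base class
        class Logged(Base):
            def run(self):
                print('about to run %s' % Base.__name__)
                return Base.run(self)
        return Logged

    class Job(object):
        def run(self):
            return 'done'

    LoggedJob = with_logging(Job)
    print(LoggedJob().run())    # logs "about to run Job", then 'done'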

> not just the spewing forth of the maximum lines of code in the minimum
> time. But I am sure you know that.

"Spewing forth" *working* code (that passes a thorough test suite and
thus demonstrably delivers a given functionality F, measured, say, in
function points), is indeed the key measure in coding productivity.  The
ease of future code modification and refactoring is of course also
important, but, if the test suites are really good and thorough, the
correlation with ease of initially passing the test suites is high.


> I am guessing that the "generic programming" of Java and C# will be
> created to allow flexibly specified algorithms to operate against any
> "collection" or parts of a "collection" of objects.

Collections are important, but far from exhaustive in terms of
generics' usefulness.  E.g., Coplien's PWAN has little to do with
collections, yet it's highly pervasive in well-designed generics code.


> > Much as I might prefer "C# with templates" or whatever, therefore, I'm
> > quite liable to choose "Managed C++" today if tasked to develop some
    ...
> Save me ! No, I will be using C# ( aka Microsoft's version of Java ) if
> I do .NET and not some managed abortion.

I wonder if you do fully appreciate the power of signature based
polymorphism (generics).  Many don't.
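
Python programmers enjoy signature-based polymorphism all the time,
albeit checked at runtime rather than at compile time as with C++
templates; a sketch, with names invented for illustration:

    class Duck(object):
        def quack(self):
            return 'quack'

    class Robot(object):
        def quack(self):
            return 'beep'

    def make_noise(critter):
        # no common base class required: anything whose signature
        # includes a suitable .quack() method is acceptable
        return critter.quack()

    for critter in (Duck(), Robot()):
        print(make_noise(critter))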


> Being "sensible" in programming is not my inclination. I prefer
> creativity at its expense. That is why I heavily prefer Python over
> sensible and pragmatic Perl or safe and limited Javascript ( or
> BScript ).

I find Python highly sensible and highly pragmatic, as (allegedly) befits
the Dutch national character.  But then, I'm not a highly creative person
(much like Carver Mead, per his interview in the Economist last
September -- "He says he has never had an original idea in his
life", and I feel quite similarly -- cf.
http://www.economist.com/displayStory.cfm?Story_ID=779543).


> I hope you are right, but I anticipate too many full-blooded languages
> becoming watered down and weak by bathing in the .NET stream. Have I not
> already seen posts on this NG about Python .NET ?

Yes, and?  Where's the "watered down and weak" effect?


Alex





