Could Python supplant Java?

James J. Besemer jb at cascade-sys.com
Fri Aug 23 15:11:49 EDT 2002


brueckd at tbye.com wrote:

> But the whole point of your argument was that _size_ was the biggest
> factor, so if one language consistently requires fewer lines of code then
> it is possibly a _huge_ factor in determining final cost.

> Ok, you've stated your position but omitted your reasoning. Do you no
> longer agree with your earlier position that size is the biggest factor
> affecting cost?

The model I was referring to (COCOMO) says that code size affects the cost of the
overall project.  Size affects the project more than any other factor except
personnel, and it affects project costs in a decidedly non-linear fashion
(polynomial or exponential, I don't recall which).

In this model there is a separate 'knob' for language power.  Over lots of
projects, with lots of languages, the 'language' knob had much less effect on
project cost than mere code size (holding language constant).  IIRC, language
also affected cost according to code size, only that contribution (the savings)
was linear in code size.

With the proper setting for language, the model would predict different costs for
the same number of lines in two different languages such as C and Python.

I was saying that on large projects the linear factors become insignificant
because the non-linear factors overwhelm the computation.  That is:

    k1 * LOC + k2 * exp( LOC ) + otherStuff( LOC )

    is similar to k3 * exp( LOC ) + otherStuff( LOC )

        for very large values of LOC.

This holds even if the language efficiency (k1) is very small (high efficiency)
because the overall size factor (k2) is known to be the largest one.  [These
aren't the real COCOMO equations -- only a grossly oversimplified example to
explain my point.]
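Purely to illustrate that point, here is a toy version of the formula in Python.  The constants k1 and k2 and the 1.2 exponent are made up for the example (they are not real COCOMO coefficients; COCOMO actually uses a power law, which is one of the "polynomial" possibilities above):

```python
# Toy cost model (NOT the real COCOMO equations): a linear
# "language efficiency" term plus a non-linear "project size" term.
def toy_cost(loc, k1=5.0, k2=0.5, exponent=1.2):
    linear = k1 * loc                  # language-sensitive part
    nonlinear = k2 * loc ** exponent   # size-driven part
    return linear + nonlinear

# Even with a fixed (favorable) k1, the non-linear term takes over
# a larger and larger share of the total cost as LOC grows.
for loc in (1_000, 100_000, 10_000_000):
    total = toy_cost(loc)
    share = (0.5 * loc ** 1.2) / total
    print(f"{loc:>12,} LOC: non-linear share of cost = {share:.1%}")
```

So a language that shrinks k1 helps, but on a big enough project the k2 term dominates regardless.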

Intuitively, Python may help, but on really large projects you need to add extra
computers, managers, secretaries, vacation time, and support personnel (greatly
adding to cost), and the choice of Python vs. C doesn't affect those portions of
the cost model at all.

Sorry if I wasn't clear.  This is at the limits of my understanding of Software
Economics.  It's not my model and the book was published before Python was
invented.  I believe it's Serious Wisdom that still applies but you're free to
blithely reject it all as bullshit.

Perhaps the mismatch is that the model is true but some of its lessons don't
apply to projects as small as the ones typically encountered by this group.

Buy the book, read and understand it and you'll know more than me:


http://www.amazon.com/exec/obidos/ASIN/0138221227/qid=1030127217/sr=2-1/ref=sr_2_1/102-5314364-9351338

> One way to look at it is as a paradigm shift. Another is to see each
> transition as one of letting the compiler and language take care of more
> of the mundane details so that more and more of your focus is on solving
> the actual development problem. For example, in assembly language, how
> much of the actual coding time is spent on problems that are in no way
> specific to the task at hand? Or put another way: for each unit of coding
> effort (X), what distance (Y) did you move forward towards the goal of
> completion? In assembly the ratio X:Y is huge, it's smaller in C, and
> smaller in Python.  I'll transition from Python when I find something
> that, all else being the same, lowers it further by a significant amount.

I agree with everything you say here.

One could also argue that C/C++ to Python is a paradigm shift, in that you are
relieved from the burden of managing storage (a sometimes very costly mundane
detail).  I've said before that GC is a key strength of Python (and Perl, Lisp,
etc.) that distinguishes it (them) from C++.
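A minimal sketch of what I mean (the Node class and the cycle are invented for illustration, not from any real project): a reference cycle like this would leak in C++ unless the programmer broke it by hand, while Python's cycle collector reclaims it with no code at all:

```python
import gc
import weakref

class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

# Build a reference cycle -- the classic manual-memory-management trap.
a, b = Node(1), Node(2)
a.next, b.next = b, a

probe = weakref.ref(a)   # watch the object without keeping it alive
del a, b
gc.collect()             # the cycle collector frees the unreachable pair

print(probe() is None)   # True: storage reclaimed automatically
```

In C++ you'd be writing destructors or juggling smart pointers (and shared_ptr alone still leaks on cycles); here the "mundane detail" simply disappears.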

> Within any particular language, productivity gains are had by adding
> "stuff" (macros, libraries, etc.) to make the language more like one with
> a smaller X:Y ratio, but at some point it just becomes more cost-effective
> to move on to the next language.

Agreed.

> No, because I already cited from my own experience a mid-range program
> (not quite 100k lines) that gave 8X difference in lines of code.

You just don't want to admit that you can't improve my example to 1/8 the size of
the C++.  ;o)

Seriously, your and others' anecdotal data has given me pause.

> This morning I found another example:  [...]
> Here's the line count with the ratio of lines to Python lines:
>
> C++: 786 (8.3x)
> Java: 466 (4.9x)
> Python: 94 (1.0x)
>
> The Java one was implemented first, followed by Python, followed by C++,
> although they were all implemented independently of each other.

I have to ask: did the Python version by any chance make use of some large XML
library not available to C++ or Java?  If so, that would not necessarily
undermine your argument from a practical standpoint, but it would help explain
some of the claims I have trouble believing.

I pointed you all to the "No Silver Bullet" article.  It was profound wisdom for
its time and I think it still applies today.  I'm particularly sensitive to the
10X claims for this reason.  I've heard the claims so many times before and since
by zealots and salesmen and they've never once been true.  It's furthermore
exactly the sort of claim you'd expect to hear on a list of dedicated Python
hackers.  So I remain skeptical.

But I've had my say, so I'll shuddup about it for now and won't bring the topic
up ever again.

I still would like to hear about the 94 line Python program, however...

Regards

--jb

--
James J. Besemer  503-280-0838 voice
http://cascade-sys.com  503-280-0375 fax
mailto:jb at cascade-sys.com





