constant in python?

Alex Martelli aleaxit at yahoo.com
Mon Aug 20 05:09:13 EDT 2001


"Bernd Nawothnig" <Bernd.Nawothnig at t-online.de> wrote in message
news:slrn9o0fte.3vvilp7.Bernd.Nawothnig at Asterix.t-online.de...
    ...
> > It seems you've never found yourself in a situation where you *have* to
> > use a given framework (e.g., MFC) because of corporate policy. [...]
    ...
> I don't want to start a religious war, but I deeply believe that the
> policy you have described above will lead into agony in the long run. It
> means the

I think this is a serious overstatement.  When a firm has accumulated
a capital of software projects that interoperate by all being based on
the same framework, it is quite a reasonable policy to say that other
projects in the suite must also use the same framework, to ensure
continued interoperation.  Software must be maintained, and it would
be very costly to ask each new software maintainer to be, or become,
familiar with several different frameworks that do roughly the same job.

> death of every dynamic development process which needs some freedom of
> choice and the freedom of information. If you have this guaranteed, it
> would be easy for you to accept 'open' restrictions. Open, because you
> can look into the source code and hopefully understand the reasons for
> the restrictions -- or change them. Nothing is created for eternity :-)

Changing the framework you use, whether you call it "forking" or
otherwise, is one of the costliest errors you can make in production
software development.  The sources of MFC are available to any
purchaser of Microsoft VC++, with which they come bundled --
when you find and fix outright bugs you're welcome to submit
the patches to the maintainer.  Whether they'll be accepted is of
course another issue, just as it would be for a free or open-source
project.  An *architectural limitation*, a DESIGN bug if you will,
is not normally treated in the same way -- again you can suggest
a deep redesign and refactoring, but your chances of getting it
accepted by the framework maintainers are slimmer.

When your changes are rejected, you're at a crossroads.  Either
you fork the project (my advice: *DON'T*!), or you find other
solutions.  Some languages are too rigid to admit of other
solutions -- others give you a chance to survive even in these
dire straits.  I prefer the latter.

> The general goals in designing a new programming language should not
> mainly be oriented to work in such unfree environments.

Free or unfree has little to do with the case.  Even if you do have
sources, and therefore implicitly the ability to fork, it's WAY
better not to.

> This is my point of view, but -- as I said above -- we should avoid a
> religious flame war about open source. I'm new to this NG (and new to
> Python too) and mainly interested in Python-related things :-)

Me too, and one of the many Python-related things that interest
me is that it is one of the languages that provide a way to solve
such problems without forking, by having the 'restrictions' of
information-hiding &c be 'advisory' (except of course for the
special case of restricted-execution, where introspection &c are
curtailed to keep untrusted code in its place).
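
For example (a tiny sketch of my own, with made-up names): even a
double-underscore "private" attribute is merely name-mangled, never
truly hidden, so client code can still reach it in an emergency:

    class Toolbar:
        def __init__(self):
            self.__auto_minimize = True   # mangled to _Toolbar__auto_minimize

    tb = Toolbar()
    # the 'restriction' is advisory: the mangled name stays reachable
    tb._Toolbar__auto_minimize = False
    print(tb._Toolbar__auto_minimize)     # emits: False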


> > If you've always had free rein in choosing the frameworks you are to
> > work with, you're in a very unusual situation in this industry.
>
> Nobody is totally free, and it is not necessary to have the freedom to
> choose the framework yourself, but to have the possibility to change
> things you can't live with. And there must be a good feeling in general.

A language that limits your "possibility to change things" to forking
the framework is simply less practical than one which, besides of
course allowing that, ALSO lets you dynamically work around problems
on an as-needed, when-needed, if-needed, on-the-fly basis.

I'm not saying it's *pleasant* to, e.g., go and optionally switch a
factory function used deep in the bowels of a framework -- but it
IS one more option you have to save your skin in emergencies.
Making the language more complicated, by enforcing visibility
and accessibility restrictions, to TAKE AWAY that option for the
sake of abstract principles would be an absurd policy.
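
Such a switch can be as small as rebinding one attribute.  A minimal
sketch (theFramework, its widgets module, and make_button are
hypothetical stand-ins here, not any real API):

    original_factory = theFramework.widgets.make_button

    def patched_factory(*args, **kwargs):
        button = original_factory(*args, **kwargs)
        button.auto_minimize = False    # the one behavior we must change
        return button

    # rebinding the attribute is all it takes to switch the factory
    theFramework.widgets.make_button = patched_factory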


> > -- once you have coded tens of thousands of lines to that framework,
> > which instantiates that particular concrete class in such-and-such
> > situation, what do you do when you THEN find out that in release N+1
> > of your program you absolutely need _another_ class for those
> > generated objects?
>
> Crying: "Shit!" - what else? *gg*
>
> But seriously: first I will have to accept that something went wrong
> with my ideas. And then I would try to redesign the whole thing,
> avoiding as many ugly quick hacks as possible.

Very nice, and very theoretical.  On one side, you have an "ugly quick hack":

    toolbarClass = theFramework.FrameWindow.Toolbar   # keep the original

    class MyToolbar(toolbarClass):
        def minimize(self):
            pass    # work around the toolbar-minimization bug: do nothing

    theFramework.FrameWindow.Toolbar = MyToolbar      # substitute our class

    # add a few more lines to make this conditional, potentially
    # restore the original toolbarClass at need, etc, etc.

taking about 2 minutes to design, implement and test for somebody who's
deeply familiar with the framework.  On the other side, you have before
you an open-ended re-architecting effort: we can no longer use ANY of
the code pathways in the framework that end up instantiating the
hard-coded concrete class theFramework.FrameWindow.Toolbar, and must
duplicate them ALL in order to ensure they instantiate our MyToolbar
instead.  If the framework is in a language which allows the designer
to specify (maybe even as the DEFAULT...) that minimize is not virtual,
the amount of work increases yet more.
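
Making the hack conditional and restorable, as the comment in the
snippet suggests, takes only a few more lines.  A sketch (assuming a
hypothetical version attribute and run_the_app entry point):

    buggy = theFramework.version <= (1, 3)   # hypothetical version check
    if buggy:
        theFramework.FrameWindow.Toolbar = MyToolbar
    try:
        run_the_app()                        # hypothetical entry point
    finally:
        if buggy:                            # restore the original class
            theFramework.FrameWindow.Toolbar = toolbarClass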

All that 'goes wrong', say, is that the framework's designers rigidly
imposed the instantiation of toolbars which inevitably minimize when
their owning frame-window minimizes -- and you need (the UI experts
specify that, and it's non-negotiable) toolbars that don't behave this
way, period.  ONE tiny detail out of many thousands you (and the
framework) are dealing with.  How high a price, in terms of language
complexity, are you willing to pay in order to ensure your situation
is insoluble except by redesigning everything from scratch, so that
you're *forced* to "Do the Right Thing"?


> > You basically have the above choices plus (in theory) that of throwing
> > everything you have away and recoding it all from scratch
>
> 'mercilessly refactoring' was the keyword ...

I have no problem with that, but it doesn't apply to components,
libraries, or whole frameworks you're re-using.  If you touch even
one source-line of that code, you're not re-using any more, you're
forking -- a completely different game.

You can and should refactor the code that your organization
develops and maintains.  If a language doesn't enforce (but
just sets as advisory) the limits of information-hiding &c, then
you can, reasonably easily, make the code you develop and
maintain affect some small but crucial aspect of the working
of the framework/library/component, WITHOUT forking that
reused code.  If you choose to use a language that does the
enforcement, you don't have that option any more.

> But it is a big difference between being stuck deep in the problem and
> writing everything from scratch knowing basically nothing. At the point
> you described, _many_ code fragments are written in your head before
> you start. Not so if you are inexperienced. We had this discussion a few
> days ago. Someone told me he had lost a floppy disc with all his source
> code a few years ago (no backup, of course :-). He needed months to
> write the code the first time but only days to do the work again. And
> the second version was better than the lost one.

Yes, a similar thing happened to me back when I was writing my
thesis, over 20 years ago -- a crash in the experimental computer
system being installed at our university took with it all of our
sources (well, we did have print-outs, as we mostly worked on
teletype-like terminals -- but no backups as the tape units were
not installed yet), so we had to rewrite it all (again: we had already
developed half the programs in APL, then had to recode them in
Fortran and assembler as the new experimental computer, a VAX,
didn't have an APL implementation).

Yes, the resulting code does get developed faster and with
better quality than the first time around, at least if the first
time is still reasonably fresh in your mind.  It still meant we
graduated in the summer session rather than in the previous
spring session, though -- meaning one more year of tuition
fees the way things worked here at the time, by the way
(we did recoup those, and then some, as the thesis won a
generous cash grant for its quality, admittedly :-).


> Hmm, one should delete all of his source code sometimes :-)

Maybe if you're 20 to 25.  When over 40, you start to see
there are more things you'd like to code than time left in
your probable remaining life to code them, so the prospect
of redoing stuff you have already done is less appealing.

And if a limitation is in the framework you're using, one that
you have to keep using, and your chosen language _enforces_
information-hiding measures, your rewrite to work around that
limitation is going to be an unsatisfactory duplication of much
of your framework -- not much better than a fork, in some
respects.  Why anybody would complicate a language to
ensure such unpleasantness is really a mystery to me.


Alex





