declaration of variables?

Steven Taschuk staschuk at telusplanet.net
Sun Feb 16 19:38:56 EST 2003


[I've reflowed some of the quoted text below.  Please tell your
software to wrap your lines somewhere sensible, say, at 68
characters.]

Quoth André Jonsson:
> Jp Calderone wrote:
  [...]
> >   A good set of unit tests will catch this error any time it comes up. A
> > good set of unit tests will also catch many *other* errors that variable
> > declaration won't.  If you don't have good unit tests, it might be time to
> > consider writing some.
> 
> I haven't done many, no, so that's probably the case.  I still
> think that receiving information about an error sooner is better
> than later.

Earlier is indeed better than later.  However, the loop
	while program not compiling:
		debug
is pretty much the same as
	while tests failing:
		debug
Insofar as early detection of bugs is concerned, it doesn't much
matter whether it's your compiler or your test suite which tells
you about a problem.
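
For instance, here's a sketch (my own hypothetical example, not
from the thread) of how a test surfaces a misspelt name just as
promptly as a declaration-checking compiler would:

	def total(items):
		result = 0
		for item in items:
			result = result + itme	# typo for 'item'
		return result

	def test_total():
		# Any test that runs the loop body raises NameError
		# on the spot -- same information, different messenger.
		assert total([1, 2, 3]) == 6

Run test_total, get a NameError pointing at the offending line;
fix it; move on.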

> >   As a Python programmer of 5 years, I can give my personal experience:
> > typos in names haven't cost me anything more than a dozen lost seconds once
> > every few weeks.  On the other hand, having to type declarations for every
> > variable I use would cost me at least a hundred times more time, on a
> > regular basis.
> 
> Funny, I think it's the other way around.

<boggle> Your experience is that debugging typos in names would
cost you a hundred times more time than typing the declarations?

I feel safe in inferring that you do not practice test-driven
development, and perhaps do not follow any automated testing
discipline at all.  I strongly recommend you look into JUnit or
PyUnit or the like.  (See below on testing a list sorter.)
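
(For the record, a minimal PyUnit sketch; the class and method
names here are made up for illustration:

	import unittest

	class SortTest(unittest.TestCase):
		def test_basic(self):
			L = [2, 8, 3, 1, 2, 0]
			L.sort()
			self.assertEqual(L, [0, 1, 2, 2, 3, 8])

	if __name__ == '__main__':
		unittest.main()

Running the module executes every test_* method and reports any
failures.)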

My experience is that, once you get used to writing thorough tests
for your code as a matter of course, variable declarations will
stop feeling like a language feature that helps you write correct
code and start feeling like a stupid little dance you have to do
to keep the compiler happy.

(I went through exactly this transition.  Java was once my
language of choice; now things like single inheritance, which are
supposed to simplify the language and make it easier to write
correct code, seem like pointless restrictions.  "I don't want you
to fall over," says Java, "use these crutches." "But I have these
shiny bionic legs!"  "That is of no concern to me.")

  [...]
> Hmm, how about this example: suppose a given function sorts some
> list, and there is an error in it that doesn't cause a run-time
> error.  How is it possible to detect this without writing the
> whole routine once more and comparing the results?  (Which, of
> course, is an error-prone process.)

Well, for a start:
	L = [2, 8, 3, 1, 2, 0]
	L.sort()
	assert L == [0, 1, 2, 2, 3, 8]
This won't catch all bugs, of course.  But if it misses a misspelt
identifier bug, either you're not using that variable anyway, or
there's a whole branch of computation which uses that variable and
is not being tested... so write a second test which does exercise
that branch.
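
Such a second test might look like this (again, my own sketch):

	L = []
	L.sort()
	assert L == []

	L = [5, 4, 3, 2, 1]	# reverse-ordered input
	L.sort()
	assert L == [1, 2, 3, 4, 5]

Each extra case drags another code path out into the light, where
a misspelt name blows up immediately as a NameError.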

Two other options applicable in this case (though not suitable for
all testing problems): First, write a simple, obviously-correct
sorter (using bubble sort, say), and compare its output to your
fancy algorithm's output for a large, automatically generated set
of lists.  Second, prove your algorithm correct.
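
A sketch of the first option -- here 'fancy_sort' stands in for
whatever routine you're actually testing (the name is made up,
and I'm assuming it returns a new sorted list rather than sorting
in place):

	import random

	def bubble_sort(items):
		# Simple, obviously-correct reference sorter.
		result = list(items)
		for i in range(len(result)):
			for j in range(len(result) - 1 - i):
				if result[j] > result[j + 1]:
					result[j], result[j + 1] = \
						result[j + 1], result[j]
		return result

	for trial in range(1000):
		n = random.randint(0, 20)
		data = [random.randint(0, 100) for i in range(n)]
		assert fancy_sort(data) == bubble_sort(data), data

A thousand random lists won't prove the algorithm correct, but
they'll flush out most bugs a hand-picked example would miss.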

-- 
Steven Taschuk           | Kinsley's Law: "Every public
staschuk at telusplanet.net |  frenzy produces legislation
                         |  purporting to address it."
