[Tutor] Beginners question

DL Neil PyTutor at danceswithmice.info
Wed Apr 1 18:48:46 EDT 2020


On 2/04/20 3:05 AM, Mats Wichmann wrote:
> On 3/31/20 4:37 PM, DL Neil via Tutor wrote:
>> Earlier, I opined that the first code-snippet is "untidy".
> curious: why?

Old f...s like me remember writing monolithic programs and the 
overwhelming chaos of 'spaghetti code'. The greatest influence which 
ushered-in the concepts of modular code and "structured programming" 
was probably Edsger Dijkstra's letter "Go To Statement Considered 
Harmful" - which spawned a whole genre of 'xyz considered harmful' 
wannabes...

Amongst such realisations was the 'virtue' of single-entry, single-exit 
routines/modules, and how such style both simplified reading and saved 
us various 'evils'. Thus, the code-blocks we know-and-love today, eg 
while... and for...


The original construct, as required by the OP, consisted of a while loop 
with a break.

The 'book of words' says: <<<"while" assignment_expression ":" suite>>>. 
Which most of us will understand as: "while condition: do-something".

In the proposed-code/answer, this construct is used/abused by 
replacing the condition with the constant True, ie to say: "loop 
forever".

As such, it is necessary to add a second structure, an "if-it's-the-end" 
or a "raise exception", in order to realise some condition under which 
the while-loop should terminate.

So, rather than the while-condition deciding whether to loop (or not), 
we now have, effectively, two conditions for the price of one. No! We 
have the price of two conditions, in order to achieve one loop.
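
For the avoidance of doubt, the shape under discussion looks 
something like this (a sketch only - the names are mine, not the 
OP's):

	entries = []
	while True:                # condition removed: "loop forever"
	    entry = input("Enter a value ('quit' to finish): ")
	    if entry == "quit":    # the second, separate exit-condition
	        break
	    entries.append(entry)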


OK, enough theory.

To me the original code looks, if not 'untidy', at least ungainly. 
(clearly a personal opinion/matter of taste)

The question about the "walrus operator" was to establish whether we 
might be able to 'simplify' *the logic* back to having a single 
condition (entry/exit), and thus a 'standard', simple while-loop.

At the same time, the while's condition would become more complex. So, 
at the code-level, the question becomes one of simplicity and 
readability vs compact-concentration and coding-power.
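
Something like this, perhaps (a sketch assuming v3.8's assignment 
expressions - untested by me, for the reason admitted below):

	entries = []
	while (entry := input("Enter a value ('quit' ends): ")) != "quit":
	    entries.append(entry)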


Again, and as you say, largely a matter of taste. I was sufficiently 
intrigued to wonder if folk had any particular opinions, preferably 
based upon experience, one-way or the other...
(particularly as I haven't progressed to v3.8 and such 'delights' as 
the walrus-operator, myself - yet... hence my wanting to 'stand on 
the shoulders of giants', etc, etc)


>> Has such become liable to a similar judgment in deciding whether, for
>> example; a complex list-comprehension expression should be coded in its
>> compact form, or broken-out into a more readable for-loop?
>> (particularly when dealing with lesser-mortals such as I).
> 
> there's no question there are tradeoffs.  a compact expression may have
> many advantages, for example more of the surrounding code can be seen in
> a viewable size snippet (you can make your terminal emulator or editor
> window able to show 100 lines, but the brain is going to want to "see" a
> much smaller chunk), making it easier to understand the code flow of the
> whole chunk, rather than just thinking about the readability of that one
> statement-or-expanded-out-loop piece.  or, it may make it harder for the
> reader to comprehend because the statement becomes hard to read. "style
> dogma" is a thing, but there are no hard and fast answers in the end.
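
For concreteness, the trade-off might look like this (an invented 
example):

	# compact: one line, more surrounding code stays on-screen
	squares = [n * n for n in range(20) if n % 2 == 0]

	# broken-out: longer, but each step can be read (and commented)
	# on its own
	squares = []
	for n in range(20):
	    if n % 2 == 0:             # keep only the even numbers
	        squares.append(n * n)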

As I age (and my eyes do too - at about the same rate!) I'm finding that 
my comfortable "chunking"* is decreasing in size, whilst my interest in 
'simplicity' rises in direct proportion. (hey, perhaps I'm just becoming 
more cautious/conservative in my old-age?)

Vis-a-vis dogma: a regular criticism is that I tend to use more classes 
and more methods/functions than (some) others might - but no-one has 
reached the level of criticism which would demand a re-write. (yet?) It 
does however give excuse for another round of jokes at 'grandpa's' 
expense, which amusement keeps us going until someone needs help...

Yesterday, I found myself with a method containing exactly one line of 
code, ie almost more docstring than code. What??? Yes, it did come-about 
due to TDD re-factoring, but when I looked again, I realised that even 
though the various attribute names *are* descriptive, having a 
method-call, and thus 'labeling', actually improved readability/provided 
its own explanation of the functionality - which would otherwise have 
required at least an inline-comment. So, I left it (for now).
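
By way of illustration (the names below are invented - the actual 
code is not reproduced here):

	class Invoice:
	    def __init__(self, net_amount, tax_rate):
	        self.net_amount = net_amount
	        self.tax_rate = tax_rate

	    def tax_payable(self):
	        """Tax due on this invoice, at the prevailing rate."""
	        return self.net_amount * self.tax_rate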

The same thing applies to the width of a code-line. Even though I move 
my eyes when reading, and have a daily regimen which includes neck 
exercises, I really dislike scanning from side-to-side to be able to 
read a single line. Accordingly, I stick with the recommended 
79/80-characters per line - and yes, I can spell "Hollerith punched 
card", even bemoan their passing and that of the IBM 029 Card Punch 
machine (but not the LOUD noise it made). Reject cards made excellent 
book-marks - but wait, who uses paper-books these days...?


* "chunking" is the correct (cognitive-)psychological term.
For those to whom it is unfamiliar, it refers to the quantity of data a 
person is comfortable 'holding' at one time. Thus, a raw beginner may 
need to look-up the syntax of a for-statement in order to type the code, 
whereas a programming-master will 'see' or 'think' in terms of the 
entire loop, and more, at once.


>> What more could we do?
>>
>> - users may have become habituated to typing "semaphores" such as "quit"
>> to end input. However, Python allows considerable flexibility over
>> previous generations of languages. Could we instead invite the user to
>> hit Enter (with no data)?
> 
> sure you can, but this is an interface design decision.  A lot of
> developers prefer to have a specific positive value in order to do
> something significant... easy to hit Enter without really thinking about
> it, but typing "quit" requires clear intent.  That's not really a
> "programming" question, we programmers can make it work any way.

Oh? You're baiting me with 'dogma', right?

OK, here's the 'hook, line, and sinker':-

In the abstract, this is correct. A responsibly-minded professional 
would not let it lie though, feeling that we have a responsibility 
towards usability and other aspects that cannot be measured in 
lines-of-code. Perhaps those of us who are not 'just' programmers 
spend more time in arenas where such discussions are more relevant?
(with any due apologies)

Modern (whatever that means) working environments are much more "Agile". 
As such, it is recognised that users should be part of a dev.team. Thus, 
their thoughts, feelings, preferences, etc, are communicated to 
tech.staff; and by the same process, users become better 'educated' in 
what is possible, and informed about the choices which will affect them 
every day (thereafter), and such-like. All to mutual advantage!


Back to the topic: if the application doesn't actually have a practical 
use for the semaphore-word, ie

	future = input( "Do you want to keep your job or quit?" )

users may actually prefer an alternative. Such alternatives may even 
simplify and thus improve the quality of 'our' code...
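
For example: an empty reply from input() is an empty string, which is 
'falsey' - so hitting Enter can terminate the loop without any 
semaphore-word at all (again, a v3.8 sketch):

	lines = []
	while line := input("Enter data (or press Enter to finish): "):
	    lines.append(line)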


Web.Refs:
https://homepages.cwi.nl/~storm/teaching/reader/Dijkstra68.pdf
https://docs.python.org/3/reference/compound_stmts.html#the-while-statement
https://en.wikipedia.org/wiki/Unit_record_equipment
-- 
Regards =dn

