Leading 0's syntax error in datetime.date module (Python 3.6)

Chris Angelico rosuav at gmail.com
Fri May 11 09:24:27 EDT 2018


On Fri, May 11, 2018 at 9:09 PM, bartc <bc at freeuk.com> wrote:
> On 11/05/2018 01:11, Chris Angelico wrote:
>>
>> On Fri, May 11, 2018 at 8:43 AM, bartc <bc at freeuk.com> wrote:
>>>
>>> This is Wrong, and would have been just as obviously wrong in 1989.
>>
>>
>> Having spent many years programming in C and working on Unix, I
>> strongly disagree.
>
>
> Using C is apt to give you a rather warped view of things, such that
> everything in that language seems superior to any other way of doing it.

Obviously, which is why you hear me vehemently arguing against garbage
collection, since in C you have to explicitly free everything up. And
since I've used C and think that everything it does is the best
possible way to do things, I have strongly argued that we should get
rid of strings as first-class objects and just use buffers of
characters, which are kinda just bytes if you squint at them.
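
Python 3, needless to say, went the other way and draws a hard line
between text strings and raw byte buffers:

    >>> "café".encode("utf-8")    # a string knows it is text
    b'caf\xc3\xa9'
    >>> list(b'caf\xc3\xa9')      # a bytes object is just numbers
    [99, 97, 102, 195, 169]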

> (And actually, because C didn't have binary literals for a long time (I
> think it still doesn't, officially), there has been a lot of discussion in
> comp.lang.c about how they are not really necessary:
>
> : A real programmer can auto-convert from hex
> : It's so easy to knock up some macro that will do it
> : They have /never/ needed binary in decades of programming
>
> And so on. Meanwhile my own /recent/ code includes lines like this:
>
>     when 2x'101'11010'000 then ... # (x64/x87 disassembler)
>
> although I think I still prefer a trailing B, with separator:
>
>     when 101'11010'000'B then ...
>
> Try /that/ in hex /or/ octal.)

I've no idea what this is supposed to mean, or why you have groups of
three, five, and three. It looks like a possible bug to me. I'm sure it
isn't, of course, since you're one of those perfect programmers who
simply doesn't _make_ errors; but if it were my code, I'd be worried
that it wasn't correct somewhere.
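
(For the record, Python has had binary literals since 2.6, and 3.6 added
underscore separators via PEP 515, so grouped binary constants need no
custom syntax. Roughly what a 3.6 session gives:

    >>> opcode = 0b101_11010_000      # same grouping as above
    >>> opcode
    1488
    >>> f'{opcode:#013b}'             # 11 bits plus the 0b prefix
    '0b10111010000'

No trailing B or leading 2x required.)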

>> This was *not* obviously wrong. It's easy to say
>> "but look at the real world"; but in the 80s and 90s, nobody would
>> have said that it was "obviously wrong" to have the native integer
>> wrap when it goes past a certain number of bits. And in fact, your
>> description of the real world makes it equally obvious that numbers
>> should have a fixed width:
>
>
> Much of the real world /does/ use fixed widths for numbers, like that
> odometer for a start, or most mechanical or electronic devices that need to
> display numbers. And with many such devices, they wrap as well (remember
> tape counters).
>
> Even my tax return has a limit on how big a sum I can enter in the boxes on
> the paper form.
>
> So the concept of fixed-width, sometimes modulo, numbers isn't alien to
> the general public. But the idea that leading zeros completely change
> the perceived value of a number IS.

Cool. So what's the native integer size for the real world? Use that
as your primary data type.

Oh, can't decide how many digits? That's a pity.
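
Python's int is arbitrary-precision, of course; if you really want
odometer-style wraparound, you have to ask for it explicitly. A minimal
sketch, masking to fake a 32-bit unsigned counter:

    >>> def wrap32(n):
    ...     # roll over past 2**32 - 1, like a fixed-width odometer
    ...     return n & 0xFFFFFFFF
    ...
    >>> wrap32(2**32 + 5)
    5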

>> Octal makes a lot of sense in the right contexts. Allowing octal
>> literals is a Good Thing. And sticking letters into the middle of a
>> number doesn't make that much sense, so the leading-zero notation is a
>> decent choice.
>
> No, it isn't. You need something that is much more explicit. I know your C
> loyalty is showing here, but just admit it was a terrible choice in that
> language, even in 1972. And just as bad in 1989.

Go get a time machine. Spend some time in the 1980s. See what kinds of
programming people were doing. HINT: It wasn't web app development.
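
To tie this back to the subject line: Python 3 kept octal literals but
made the prefix explicit (PEP 3127), which is exactly why the
datetime.date call that started this thread fails at compile time instead
of silently meaning something else. Roughly what a 3.6 session shows (the
exact wording of the error varies by version):

    >>> import datetime
    >>> datetime.date(2018, 05, 11)
    SyntaxError: invalid token
    >>> datetime.date(2018, 0o5, 11)   # explicit octal prefix is fine
    datetime.date(2018, 5, 11)
    >>> 0o644                          # the Python 3 spelling of C's 0644
    420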

> Why do language designers perpetuate bad ideas? The reason for designing a
> new language is just so you can get rid of some of them!

Yeah, which is why your personal pet language has approximately one
user. The more things you change when you create a new language, the
more likely it is to be utterly useless to anyone but yourself.

Consistency is a lot more important than many people give it credit for.

ChrisA


