Python 3.2 has some deadly infection

Rustom Mody rustompmody at gmail.com
Fri Jun 6 13:04:39 EDT 2014


On Friday, June 6, 2014 10:18:41 PM UTC+5:30, Chris Angelico wrote:
> On Sat, Jun 7, 2014 at 2:21 AM, Rustom Mody  wrote:
> > Combine that with Chris':
> >> Yes and no. "ASCII" means two things: Firstly, it's a mapping from the
> >> letter A to the number 65, from the exclamation mark to 33, from the
> >> backslash to 92, and so on. And secondly, it's an encoding of those
> >> numbers into the lowest seven bits of a byte, with the high bit left
> >> clear. Between those two, you get a means of representing the letter
> >> 'A' as the byte 0x41, and one of them is an encoding.
> > and the situation appears quite the opposite of Ethan's description:
> > In the 'old world', ASCII was both the mapping and the encoding, so there
> > was never any reason to distinguish an encoding from a codepoint.
> > It is Unicode that demands these distinctions.
> > If we could magically move to a world where the number of bits in a byte
> > were 32, all this headache would go away. [Actually just 21 is enough!]
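
To make that concrete, here are the two layers in any Python 3 (my
illustration, not part of the original exchange):

    >>> ord('A')                 # the mapping: character -> codepoint
    65
    >>> 'A'.encode('ascii')      # the encoding: codepoint -> byte 0x41
    b'A'
    >>> 'A'.encode('utf-32-le')  # same codepoint, a different encoding
    b'A\x00\x00\x00'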

> An ASCII mentality lets you be sloppy. That doesn't mean the
> distinction doesn't exist. When I first started programming in C, int
> was *always* 16 bits long and *always* little-endian (because I used
> only one compiler). I could pretend that those bits in memory actually
> were that integer, that there were no other ways that integer could be
> encoded. That doesn't mean that encodings weren't important. And as
> soon as I started working on a 32-bit OS/2 system, and my ints became
> bigger, I had to concern myself with that. Even more so when I got
> into networking, and byte order became important to me. And of course,
> these days I work with integers that are encoded in all sorts of
> different ways (a Python integer isn't just a puddle of bytes in
> memory), and I generally let someone else take care of the details,
> but the encodings are still there.
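
(Again my illustration, not Chris's: the same small integer under two
byte orders, using the stdlib struct module:)

    >>> import struct
    >>> struct.pack('<i', 1)     # 32-bit int, little-endian
    b'\x01\x00\x00\x00'
    >>> struct.pack('>i', 1)     # same value, big-endian / network order
    b'\x00\x00\x00\x01'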

> ASCII was once your one companion, it was all that mattered. ASCII was
> once a friendly encoding, then your world was shattered. Wishing it
> were somehow here again, wishing it were somehow near... sometimes it
> seemed, if you just dreamed, somehow it would be here! Wishing you
> could use just bytes again, knowing that you never would... dreaming
> of it won't help you to do all that you dream you could!

> It's time to stop chasing the phantom and start living in the Raoul
> world... err, the real world. :)

I thought that "if only bytes were 21+ bits wide" would sound sufficiently
nonsensical that I did not need to explicitly qualify it as a utopian dream!
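
For anyone wondering where the 21 comes from: it is just the bit-width
of the largest Unicode codepoint, U+10FFFF. Easy to check in Python 3:

    >>> (0x10FFFF).bit_length()  # bits needed for the last codepoint
    21
    >>> 2**21 > 0x10FFFF         # 21-bit 'bytes' would cover all of Unicode
    True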


