unicode encoding usability problem

Fredrik Lundh fredrik at pythonware.com
Fri Feb 18 13:24:10 EST 2005


anonymous coward <aurora00 at gmail.com> wrote:

> This brings up another issue. Most references and books focus exclusively on entering unicode
> literals and using the encode/decode methods. The fallacy is that strings are such a basic data
> type, used throughout the program, that you really don't want to make an individual decision
> every time you use a string (and take a penalty for any negligence). Java has a much more usable
> model, with unicode used internally and encoding/decoding decisions needed only twice, when
> dealing with input and output.

that's how you should do things in Python too, of course.  a unicode string
uses unicode internally. decode on the way in, encode on the way out, and
things just work.
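a minimal sketch of that pattern, in today's terms (this post predates Python 3, where the default string type is unicode; the byte values below are just an example):

```python
# decode on the way in: bytes from the outside world become unicode text
data = b"caf\xc3\xa9"            # UTF-8 bytes, e.g. read from a file or socket
text = data.decode("utf-8")      # unicode string "café"

# work with unicode internally, no encoding decisions needed here
upper = text.upper()             # "CAFÉ"

# encode on the way out: unicode text becomes bytes again
out = upper.encode("utf-8")      # b"CAF\xc3\x89"
```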

the fact that you can mess things up by mixing unicode strings with binary
strings doesn't mean that you have to mix unicode strings with binary strings
in your program.
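for illustration, here is what such a mix looks like. note this is modern Python 3 behaviour, where the mix fails loudly with a TypeError; in the Python 2 of this post it could sometimes "work" via implicit ASCII coercion and then blow up later on non-ASCII data:

```python
text = u"caf\u00e9"   # unicode string
data = b"\xc3\xa9"    # binary string (raw bytes)

# mixing the two is refused outright rather than silently coerced
try:
    combined = text + data
except TypeError as exc:
    print("refused to mix unicode and bytes:", exc)
```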

> Even for those who choose to use unicode, it is almost impossible to ensure their programs work
> correctly.

well, if you use unicode the way it was intended, it just works.

</F> 

More information about the Python-list mailing list