[Pythonmac-SIG] str.decode() behaves differently in 2.5 and 2.6

has hengist.podd at virgin.net
Sun Oct 12 12:42:52 CEST 2008


Hi folks,

Figure I should check here before filing a bug. Anyone understand the  
following discrepancy between 2.5 and 2.6:

$ python2.5
Python 2.5.1 (r251:54863, Jan 17 2008, 19:35:17)
[GCC 4.0.1 (Apple Inc. build 5465)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> '\x41\x00'.decode('utf16')
u'A'

$ python2.6
Python 2.6 (trunk:66714:66715M, Oct  1 2008, 18:36:04)
[GCC 4.0.1 (Apple Computer, Inc. build 5370)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> '\x41\x00'.decode('utf16')
u'\u4100'


This is on OS X 10.5.5/i386, comparing the default 2.5.1 Python
installation with the Python 2.6 framework distribution from python.org.
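For what it's worth, my reading is that 2.6 changed the BOM-less 'utf16'
default to big-endian (as RFC 2781 specifies), where 2.5 fell back to the
platform's native byte order — hence u'A' on an i386 Mac. If that's right,
naming the endianness explicitly should give the same answer on both
versions. A quick sketch using the standard 'utf-16-le'/'utf-16-be' codecs:

```python
# Spelling out the byte order sidesteps the BOM-less ambiguity.
# (On 2.5, drop the b prefix; bytes literals arrived in 2.6.)
data = b'\x41\x00'

print(data.decode('utf-16-le'))  # little-endian: u'A' (what 2.5 gives here)
print(data.decode('utf-16-be'))  # big-endian: u'\u4100' (what 2.6 gives)
```

If the data might carry a BOM, plain 'utf-16' still honours it on both
versions; the discrepancy only bites when no BOM is present.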

has
-- 
Control AppleScriptable applications from Python, Ruby and ObjC:
http://appscript.sourceforge.net
