UnicodeDecodeError: 'utf-8' codec can't decode byte 0xb6 in position 0: invalid start byte

Lele Gaifax lele at metapensiero.it
Fri Jul 5 07:16:55 EDT 2013


Νίκος Gr33k <nikos at superhost.gr> writes:

> On 5/7/2013 1:59 pm, Lele Gaifax wrote:
>> Νίκος Gr33k <nikos at superhost.gr> writes:
>>
>>> UnicodeDecodeError('utf-8', b'\xb6\xe3\xed\xf9\xf3
>>>
>>> but what string does it try to decode and keeps failing?
>>
>> Reasonably it's the second one, as the first is clearly the name of
>> the codec that tried to decode it to Unicode.
>
> 2nd one of what? 2nd byte in order?

You asked “what string” (although you probably meant “which string”):

UnicodeDecodeError('utf-8', b'\xb6\xe3\xed\xf9\xf3

first string-------^^^^^^^
second string---------------^^^^^^^^^^^^^^^^^^^^^^
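For the record, you can reproduce and inspect this in Python 3 yourself: the exception object carries both strings as attributes, along with the position and reason shown in the traceback.

```python
# 0xb6 is not a valid UTF-8 start byte, so decoding fails immediately.
data = b'\xb6\xe3\xed\xf9\xf3'
try:
    data.decode('utf-8')
except UnicodeDecodeError as exc:
    print(exc.encoding)  # 'utf-8' -- the codec name, the "first string"
    print(exc.object)    # the byte string that failed, the "second string"
    print(exc.start)     # 0 -- index of the offending byte
    print(exc.reason)    # 'invalid start byte'
```

(Those bytes look like ISO-8859-7 / Windows-1253 Greek, which is why decoding them as UTF-8 fails.)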

ciao, lele.
-- 
nickname: Lele Gaifax | When I live off what I thought yesterday
real: Emanuele Gaifas | I will start to fear those who copy me.
lele at metapensiero.it  |                 -- Fortunato Depero, 1929.



