UnicodeDecodeError: 'utf-8' codec can't decode byte 0xb6 in position 0: invalid start byte
Ned Batchelder
ned at nedbatchelder.com
Sun Sep 29 06:53:27 EDT 2013
On 9/29/13 4:04 AM, Νίκος wrote:
> On 29/9/2013 11:00 AM, Chris Angelico wrote:
>> On Sun, Sep 29, 2013 at 5:53 PM, Νίκος <nikos.gr33k at gmail.com> wrote:
>>> Re: UnicodeDecodeError: 'utf-8' codec can't decode byte 0xb6 in
>>> position 0:
>>> invalid start byte
>>
>> Something's trying to decode a stream of bytes as UTF-8, and it's not
>> UTF-8. Work out what in your code is bytes and what is strings, and do
>> your own conversions.
>>
>> http://www.joelonsoftware.com/articles/Unicode.html
>>
>> READ IT. Do not write another line of code until you actually
>> understand what he's saying there.
>>
>> ChrisA
>>
> okay, I will, but isn't this just weird:
>
> How come, all these days, the following line worked as expected:
>
> host = socket.gethostbyaddr( ipval ) [0]
>
> and only today it happened to output this error:
> UnicodeDecodeError: 'utf-8' codec can't decode byte 0xb6 in position 0:
> invalid start byte
>
> Nothing got added to my script. This is the only line that tries to
> determine the host.
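[Editorial note: the failure mode described above can be contained. The sketch below is a hypothetical wrapper, not code from the thread — `safe_reverse_lookup` is an invented name. The point is that a malformed PTR record can make the decode inside `gethostbyaddr` raise `UnicodeDecodeError`, so it is worth catching alongside the usual DNS failures.]

```python
import socket

def safe_reverse_lookup(ipval):
    """Reverse-DNS lookup that falls back to the raw address.

    Hypothetical helper: if the DNS reply for this address contains
    bytes that are not valid UTF-8, gethostbyaddr can blow up with
    UnicodeDecodeError instead of a normal DNS error.
    """
    try:
        return socket.gethostbyaddr(ipval)[0]
    except (socket.herror, socket.gaierror, OSError, UnicodeDecodeError):
        # Lookup failed or the hostname was undecodable: keep the IP.
        return ipval
```

With a guard like this, a single bad PTR record degrades to showing the IP address instead of crashing the whole CGI script.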
This is the nature of Unicode pain in Python 2 (Python 3 has a different
kind!). This may help you understand what's going on:
http://nedbatchelder.com/text/unipain.html
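[Editorial note: the exact error in the subject line is easy to reproduce, which makes the bytes-versus-text distinction concrete. A minimal sketch — the choice of Latin-1 below is an assumption for illustration, since 0xb6 happens to be '¶' in that encoding:]

```python
# 0xb6 is not a valid UTF-8 start byte, so decoding it as UTF-8 fails:
raw = b"\xb6"
try:
    raw.decode("utf-8")
except UnicodeDecodeError as exc:
    print(exc)  # 'utf-8' codec can't decode byte 0xb6 in position 0: ...

# If the bytes are really in some single-byte encoding (an assumption;
# in Latin-1, 0xb6 is the pilcrow sign), decode with that codec:
print(raw.decode("latin-1"))

# Or decode defensively, replacing undecodable bytes with U+FFFD:
print(raw.decode("utf-8", errors="replace"))
```

This is Chris's point above: find out which values in the script are bytes and which are text, and decode explicitly with the encoding the data actually uses.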
--Ned.