UnicodeDecodeError: 'utf-8' codec can't decode byte 0xb6 in position 0: invalid start byte

Νίκος nikos.gr33k at gmail.com
Sun Sep 29 04:04:16 EDT 2013


On 29/9/2013 11:00 AM, Chris Angelico wrote:
> On Sun, Sep 29, 2013 at 5:53 PM, Νίκος <nikos.gr33k at gmail.com> wrote:
>> Re: UnicodeDecodeError: 'utf-8' codec can't decode byte 0xb6 in position 0:
>> invalid start byte
>
> Something's trying to decode a stream of bytes as UTF-8, and it's not
> UTF-8. Work out what in your code is bytes and what is strings, and do
> your own conversions.
>
> http://www.joelonsoftware.com/articles/Unicode.html
>
> READ IT. Do not write another line of code until you actually
> understand what he's saying there.
>
> ChrisA
>
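A minimal sketch of that "do your own conversions" advice, using a
hypothetical bytes value; bytes.decode() takes an explicit encoding and an
error handler, so a stray byte such as 0xb6 does not have to abort the
script:

raw = b"\xb6bogus-bytes"   # hypothetical data that is not valid UTF-8

# Explicit decode with a chosen error handler: undecodable bytes become
# U+FFFD instead of raising UnicodeDecodeError.
text = raw.decode("utf-8", errors="replace")

# Or decode with the encoding the data is actually in, e.g. Greek ISO-8859-7.
text_greek = raw.decode("iso-8859-7")

print(text, text_greek)
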
Okay, I will, but isn't this just weird:

How come all these days the following line worked as expected:

host = socket.gethostbyaddr(ipval)[0]

and only just today happened to output a
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xb6 in position 0:
invalid start byte
error?

Nothing got added to my script. This is the only line that tries to 
determine the host.
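
For what it's worth, a minimal sketch of guarding that exact lookup, assuming
the UnicodeDecodeError is raised inside gethostbyaddr() when the PTR record
for ipval carries a hostname that is not valid UTF-8 (the variable name ipval
is taken from the post; the address below is a hypothetical placeholder):

import socket

ipval = "203.0.113.7"   # hypothetical client address

try:
    # Reverse DNS lookup; gethostbyaddr() can raise socket.herror or
    # socket.gaierror when no PTR record exists, and the returned name
    # may also fail to decode, as reported above.
    host = socket.gethostbyaddr(ipval)[0]
except (socket.herror, socket.gaierror, UnicodeDecodeError):
    # Fall back to the bare IP when the name cannot be resolved or decoded.
    host = ipval

print(host)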


