> Unicode is not 16-bit any more than ASCII is 8-bit. And you used the
> word "encod[e]", which is the standard way to turn Unicode into bytes
> anyway. No, a Unicode string is a series of codepoints - it's more
> similar to a list of ints than to a stream of bytes.

Okay, now you're in blah, blah land. --mark
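For what it's worth, the quoted point is easy to see in Python, where a string really does behave like a sequence of codepoints and only becomes bytes when you encode it (a small sketch; the string `"héllo"` here is just an illustration):

```python
# A Unicode string is a sequence of codepoints (ints, one per character);
# encoding is the separate step that turns it into bytes.
s = "héllo"

codepoints = [ord(c) for c in s]   # list of ints, one per codepoint
encoded = s.encode("utf-8")        # bytes; length depends on the encoding

print(codepoints)    # [104, 233, 108, 108, 111]
print(len(s))        # 5 codepoints
print(len(encoded))  # 6 bytes -- 'é' takes two bytes in UTF-8
```

Note that `len(s)` counts codepoints while `len(encoded)` counts bytes, which is exactly the "list of ints vs. stream of bytes" distinction above.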