Nick Coghlan wrote:

> For binary wrappers around the Windows Unicode APIs, I was thinking
> specifically of using UTF-8, since that should be able to encode
> anything the Unicode APIs can handle.

Why shouldn't the binary interface just expose the raw UTF-16 as bytes?

--
Greg
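[A minimal sketch, not from the thread, illustrating the distinction being debated. The Windows Unicode APIs deal in UTF-16 code units and permit unpaired surrogates, which strict UTF-8 cannot encode; passing the raw UTF-16 code units through as bytes sidesteps that. The string below is a hypothetical example, and `surrogatepass` is CPython's escape hatch for such code units:]

```python
# Hypothetical name containing a lone (unpaired) surrogate, which
# Windows filenames are allowed to contain.
s = "abc\ud800def"

# Strict UTF-8 rejects the unpaired surrogate:
try:
    s.encode("utf-8")
except UnicodeEncodeError:
    print("strict UTF-8 cannot encode it")

# Exposing the raw UTF-16 code units as bytes round-trips exactly:
raw = s.encode("utf-16-le", "surrogatepass")
assert raw.decode("utf-16-le", "surrogatepass") == s
```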