GCC and Unicode (OT)

Samuel A. Falvo II kc5tja at garnet.armored.net
Sun Jan 2 22:25:46 EST 2000


Please indulge my off-topic request for a brief moment -- it's not at
all Python-related, but I'm not having much luck with this particular problem.
Does anyone know of any way to make GCC produce 16-bit characters when
encoding a Unicode string?  That is, if I have:

	wchar_t *myText = L"ABCD";

I want myText to point to the following:

	0x0041 0x0042 0x0043 0x0044

instead of:

	0x00000041 0x00000042 0x00000043 0x00000044

I'm using GCC under the Linux operating system.
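
One lead I've turned up so far is GCC's -fshort-wchar option, which is
supposed to override the underlying type of wchar_t so that it is 16 bits
wide (apparently at the cost of binary compatibility with glibc's
wide-character routines, so treat this as a sketch rather than a
known-good recipe).  Assuming the option behaves as documented, I'd
expect something like the following to show what I'm after:

	/* sketch.c -- compile with: gcc -fshort-wchar sketch.c */
	#include <stdio.h>
	#include <stddef.h>	/* for wchar_t */

	int main(void)
	{
		wchar_t *myText = L"ABCD";
		int i;

		/* With -fshort-wchar, sizeof(wchar_t) should be 2, so each
		 * element of myText occupies 16 bits. */
		printf("sizeof(wchar_t) = %u\n", (unsigned)sizeof(wchar_t));

		/* Walk the string up to its terminating L'\0' and dump each
		 * element; I'm hoping for 0x0041 0x0042 0x0043 0x0044. */
		for (i = 0; myText[i] != 0; i++)
			printf("0x%04X ", (unsigned)myText[i]);
		printf("\n");

		return 0;
	}

If that flag isn't the right answer, pointers to whatever is would be
much appreciated.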

Thanks for any tips you folks have, and I apologize for the off-topic nature
of this message.

-- 
KC5TJA/6, DM13, QRP-L #1447
Samuel A. Falvo II
Oceanside, CA


