[OT] Re: Why Is Escaping Data Considered So Magical?

Michael Torrie torriem at gmail.com
Wed Jun 30 10:02:47 EDT 2010


On 06/30/2010 03:00 AM, Jorgen Grahn wrote:
> On Wed, 2010-06-30, Michael Torrie wrote:
>> On 06/29/2010 10:17 PM, Michael Torrie wrote:
>>> On 06/29/2010 10:05 PM, Michael Torrie wrote:
>>>> #include <stdio.h>
>>>>
>>>> int main(int argc, char ** argv)
>>>> {
>>>> 	char *buf = malloc(512 * sizeof(char));
>>>> 	const int a = 2, b = 3;
>>>> 	snprintf(&buf, sizeof buf, "%d + %d = %d\n", a, b, a + b);
>>>                        ^^^^^^^^^^
>>> Make that 512*sizeof(buf)
>>
>> Sigh.  Try again.  How about "512 * sizeof(char)" ?  Still doesn't make
>> a difference.  The code still crashes because &buf is incorrect.
> 
> I haven't tried to understand the rest ... but never write
> 'sizeof(char)' unless you might change the type later. 'sizeof(char)'
> is by definition 1 -- even on odd-ball architectures where a char is
> e.g. 16 bits.

You're right.  I normally don't use sizeof(char).  This is obviously a
contrived example; I just wanted to make the example such that there's
no way the original poster could argue that the crash is caused by
something other than &buf.
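
For the record, here's roughly what a fixed version would look like
(with <stdlib.h> added for malloc, which the original snippet also
left out):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
	/* 512 bytes; sizeof(char) is 1 by definition, so it's redundant */
	char *buf = malloc(512);
	if (buf == NULL)
		return 1;
	const int a = 2, b = 3;
	/* pass buf itself, not &buf, and pass the actual buffer size;
	   sizeof buf would only give the size of the pointer */
	snprintf(buf, 512, "%d + %d = %d\n", a, b, a + b);
	printf("%s", buf);
	free(buf);
	return 0;
}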

Then again, it's always a bad idea in C to make assumptions about
anything.  If you're on Windows and want to use the Unicode versions of
everything, the character size is no longer 1, so you'd need an
explicit sizeof().  Using it here would remind you that when you move
to the 16-bit Microsoft Unicode version of snprintf, you need to change
the sizeof(char) expressions to sizeof(wchar_t) as well.
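
To make that concrete, here's an untested sketch of how the same code
might look with wide characters, using the standard swprintf from
<wchar.h> (note that swprintf takes its size argument as a count of
wide characters, not bytes):

#include <stdlib.h>
#include <wchar.h>

int main(void)
{
	/* here sizeof(wchar_t) actually matters: 512 wide characters
	   occupy more than 512 bytes on most platforms */
	wchar_t *buf = malloc(512 * sizeof(wchar_t));
	if (buf == NULL)
		return 1;
	const int a = 2, b = 3;
	/* the second argument counts wide characters, not bytes */
	swprintf(buf, 512, L"%d + %d = %d\n", a, b, a + b);
	wprintf(L"%ls", buf);
	free(buf);
	return 0;
}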


