A note for C programmers

Discussion of chess software programming and technical issues.

Moderators: hgm, Rebel, chrisw

Rein Halbersma
Posts: 741
Joined: Tue May 22, 2007 11:13 am

Re: A note for C programmers

Post by Rein Halbersma »

wgarvin wrote:There are surprisingly many programmers out there who write C or C++ code every day, but have never had much exposure to the idea of undefined behavior in these languages, or its consequences. They think of optimizing compilers as something helpful, but to write sound programs it might be more helpful to think of the compiler as your enemy! :lol:

(bob is obviously not one of those newbies, so I couldn't resist poking a bit of fun at him in the previous post. Of course he's indignant that some pointless change by a library implementor broke his decades-old code, and I don't think he would try to defend undefined behavior in most situations. It's a rat's nest of potential problems, and it makes it very difficult to actually write large programs in C or C++ that are secure and future-proof).

To anyone reading this who hasn't had much exposure to undefined behavior before, I offer the following depressing reading material.

(1) Dangerous Optimizations and the Loss of Causality is a classic presentation about the growing trend of compiler writers exploiting the freedom of undefined behavior. The spec lets them assume you never invoke undefined behavior in your code; as they are increasingly taking advantage of that, more and more "working" old code is put at risk of failing in subtle and unexpected ways. Code that invokes any undefined behavior is basically a potential time bomb that might someday, with the help of an eager compiler writer, decide to blow up your program, or (much worse) introduce subtle security vulnerabilities into it. Every C or C++ programmer ought to know enough about this stuff to avoid falling into the bear pit.

(2) Undefined Behavior: What Happened to My Code? is a nice paper with several frightening examples of the trouble undefined behavior can cause: safety checks optimized out, division-by-zero causing a signal even in code that explicitly checks for zero before dividing, and other horrors.

(3) Understanding Integer Overflow in C/C++ is another eye-opening paper, I'm sure I've linked it here before. The authors surveyed many real-world programs with their dynamic checker and found lots of examples of undefined behavior bugs.
The C++ Standard Committee has commissioned a Study Group on Undefined and Unspecified Behavior. See http://www.open-std.org/pipermail/ub/20 ... 00005.html for their goals and some interesting links.
mar
Posts: 2554
Joined: Fri Nov 26, 2010 2:00 pm
Location: Czech Republic
Full name: Martin Sedlak

Re: A note for C programmers

Post by mar »

wgarvin wrote:There are surprisingly many programmers out there who write C or C++ code every day, but have never had much exposure to the idea of undefined behavior in these languages, or its consequences. They think of optimizing compilers as something helpful, but to write sound programs it might be more helpful to think of the compiler as your enemy! :lol:
Well... I always thought until recently that C uses 2's complement and that conversion from signed to unsigned simply reinterprets the value as unsigned, keeping the bit representation.
I will have to read the specification some day it seems.
I did a lot of stuff like

Code: Select all

if ( (unsigned)x >= y ) return blah;
assert( (unsigned)x < y );
where x is signed and y is unsigned.
Obviously it's undefined behavior. You really shocked me by saying that null ptr doesn't have to be 0. I think the standard allows conversion from constant 0 to null pointer but I'm not sure.
My favorite memset( &x, -1, sizeof(x) ) also relies on 2's complement.
A clever comparison of two values using subtraction (which is how the x86 cmp instruction works internally), like return x - y, won't work either.
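A minimal sketch of the two idioms side by side (function names are mine, just for illustration):

```c
#include <limits.h>

/* The pitfall: a qsort-style comparator written as x - y invokes
   undefined signed overflow when the operands are far apart
   (e.g. x = INT_MIN, y = 1). */
int cmp_subtract(int x, int y) { return x - y; }   /* UB on overflow */

/* A safe idiom that relies only on well-defined comparisons:
   yields -1, 0, or 1 for any pair of ints. */
int cmp_safe(int x, int y) { return (x > y) - (x < y); }
```

The safe version usually compiles down to a couple of branch-free instructions anyway, so there is little reason to keep the subtraction trick.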
There were times when programming languages served us and allowed us to make our ideas come true. Today I have the feeling that this has changed and we serve the languages instead :)
Seriously, is there any CPU today that uses 1's complement or sign-magnitude integers?
2's complement is simply awesome as it allows the CPU to do addition/subtraction without worrying about whether it's a signed or unsigned integer. It's the 21st century. Even the ancient 6502 used 2's complement to represent relative jumps.
Sure, mathematicians may need negative zero, but that's what floats are for. There is absolutely no need for negative zero in integers.
I think the standards people should focus on improving the standard by removing undefined behavior that has been obsolete for decades, instead of adding useless stuff that makes already bloated languages even more bloated.
I only hope that there are/will be compiler warnings that will catch potential problems at compile time, i.e. before they happen.
mvk
Posts: 589
Joined: Tue Jun 04, 2013 10:15 pm

Re: A note for C programmers

Post by mvk »

wgarvin wrote:"Permissible undefined behavior ranges from ignoring the situation completely with unpredictable results, to having demons fly out of your nose."
Isn't the last option exactly what happened here?

PS: :-)
wgarvin
Posts: 838
Joined: Thu Jul 05, 2007 5:03 pm
Location: British Columbia, Canada

Re: A note for C programmers

Post by wgarvin »

Martin Sedlak wrote:You really shocked me by saying that null ptr doesn't have to be 0. I think the standard allows conversion from constant 0 to null pointer but I'm not sure.
Mission accomplished then! I like to spread the word about undefined behavior; it can be a source of all sorts of really weird bugs. Some behaviors are "unspecified" or "implementation-defined" instead, and it's useful to be aware of which code constructs will be portable (according to the standards) and which won't. But "undefined" is dangerous and should be avoided as much as possible.

For pointers, a 0 literal in your source code will be converted into a NULL pointer at compile time, but there's no guarantee that a NULL pointer has all its bits as zeros. All real implementations I've ever seen do it that way, and a lot of existing code assumes it is the case. But there used to be some really odd machines back when C was first invented, and there are still some slightly odd chips today (such as DSPs). Nowadays, 2's complement is pretty ubiquitous, along with 8-bit chars, 32- and 64-bit IEEE-754 floating-point support, and so on. But C was specified loosely to try and accommodate all of the existing hardware back then, and also any future hardware, in case different, better ideas somehow came to prominence in hardware design.

As a result, if you bit-shift an N-bit integer by N or more bits, the result is undefined. I think shifting anything into or out of the sign bit of a signed type is implementation-defined. If you add anything to a signed type that causes overflow, the behavior is undefined. If you negate the most negative value of a signed type, or divide it by -1, the result is undefined. If you cause more than one modification to the same l-value between sequence points, the result is undefined. If you bind a temporary r-value (such as the value returned by a function call) to a non-const temporary, and then try to access it, (I think?) the result is undefined. And so on, ad nauseam.

If you write code that sometimes triggers undefined behavior, the compiler can assume that the undefined thing NEVER happens, and optimize your code into something that only does what you expected in the well-defined cases. If you accidentally write something that ALWAYS has undefined behaviour on a specific control path (or if that code is only reachable by passing through some undefined behavior), the compiler is free to DELETE that code path from the compiled output. It can be an unpleasant surprise for the programmer, especially if he has been writing code in a similar style for many years and the compiler never did that to him before.
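For example, here's a sketch (hypothetical names, not from any real codebase) of an overflow test that an optimizer is entitled to fold away:

```c
#include <limits.h>

/* A post-hoc overflow test the compiler may delete entirely:
   x + 1 < x can only be true after undefined signed overflow,
   so the optimizer is allowed to compile this as "return 0;". */
int will_overflow_naive(int x) { return x + 1 < x; }

/* Well-defined replacement: test against the limit BEFORE adding,
   using only comparisons that cannot overflow. */
int will_overflow_safe(int x) { return x == INT_MAX; }
```

Older compilers tended to leave the naive version alone, which is exactly why code written in that style can suddenly break when the toolchain is upgraded.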
wgarvin
Posts: 838
Joined: Thu Jul 05, 2007 5:03 pm
Location: British Columbia, Canada

Re: A note for C programmers

Post by wgarvin »

Ugh, where I wrote "bind a temporary ... to a non-const temporary" I meant to write "to a non-const reference". Oddly enough, binding it to a const reference auto (stack) variable extends the lifetime of the temporary to the end of the scope enclosing the reference. Which is super convenient sometimes... But it's important not to forget that const!
User avatar
lucasart
Posts: 3232
Joined: Mon May 31, 2010 1:29 pm
Full name: lucasart

Re: A note for C programmers

Post by lucasart »

wgarvin wrote:(2) Undefined Behavior: What Happened to My Code? is a nice paper with several frightening examples of the trouble undefined behavior can cause: safety checks optimized out, division-by-zero causing a signal even in code that code explicitly checks for zero before the divide, and other horrors.
Didn't have time to read the other two, but read this one with great interest. I remember encountering the oversized shift on x86 and x86-64. I was surprised by what happened. In a way, I was lucky not to be using a PowerPC (which would produce the zero result I naively expected). If I had used a PPC, I would have written code that would not work on x86, and it could have taken me a while to understand why.

Some of the other mistakes are quite obvious. For example: who would dereference a pointer and only check it for null afterwards? The same goes for division by zero.
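That check-after-use pattern looks something like this (hypothetical names); the nasty part is that the compiler may then finish the job by deleting the check:

```c
#include <stddef.h>

/* The dereference on the first line is undefined behavior when p is
   NULL, so an optimizing compiler may assume p != NULL from then on
   and silently remove the safety check below it. */
int first_element(int *p) {
    int v = *p;          /* UB if p == NULL */
    if (p == NULL)       /* the optimizer may delete this branch */
        return -1;
    return v;
}
```

The fix is of course to test the pointer before touching it, so the check dominates the dereference instead of trailing it.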

Some of them are tricky, especially signed overflow, which is very nasty to deal with. Let's say you want to add a+b and both are signed int. How do you check for overflow in a portable way, without compromising simplicity or efficiency?
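For what it's worth, here is one portable way to do that pre-check (a sketch, certainly not the only way; the function name is mine):

```c
#include <limits.h>

/* Returns nonzero if a + b would overflow a signed int.
   Every comparison here operates on values that cannot themselves
   overflow, so the check is well-defined on any conforming compiler. */
int add_would_overflow(int a, int b) {
    if (b > 0) return a > INT_MAX - b;  /* a + b would exceed INT_MAX */
    if (b < 0) return a < INT_MIN - b;  /* a + b would fall below INT_MIN */
    return 0;                           /* b == 0 never overflows */
}
```

It costs a branch or two, which is the usual price of doing this portably in standard C.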

There are areas where the C standard should define the expected behaviour. 2's complement and 8-bit char should become standard now. I don't know what kind of weird machines lived in the 1970s, but in the 21st century these defective designs have hopefully been eradicated.

Ultimately, these mistakes happen even when the programmer is competent and aware of these issues. Mistakes happen; we're only human. Even more so when you have a large source base developed by many people, not all of whom are highly competent. The real problem is: how do you detect and isolate these bugs? Detecting that your program as a whole does not work correctly, either in obvious ways or in unobvious ones (an integration test like node count, in the case of chess), may be possible, but drilling down to the offending code can be much harder.

"Demons flying out of your nose" is a very accurate metaphor. It's exactly the kind of problem I've experienced in the past, and it took me forever to fix, because it can be so bloody hard to isolate the offending code, or even reproduce the bug deterministically.
Theory and practice sometimes clash. And when that happens, theory loses. Every single time.
mar
Posts: 2554
Joined: Fri Nov 26, 2010 2:00 pm
Location: Czech Republic
Full name: Martin Sedlak

Re: A note for C programmers

Post by mar »

I finally had a quick look at the C standard. At least we can rely on non-negative values of signed integers having the same representation as unsigned integers. I think (if I understood correctly) that unsigned integers always wrap the way you would expect, so I was a bit puzzled about this remark in one of the documents you posted:

Code: Select all

For example 0U-1 is well-defined and evaluates to UINT_MAX, but the actual value of that constant is implementation defined: it can be relied upon, but only within the context of a particular compiler and platform.
If the author meant that unsigned int can have different sizes on different platforms, ok. I will look into the standard again to be sure, but I think it guarantees that UINT_MAX is 2^n-1 and that the minimum is 0. I think the only difference across platforms should be endianness.
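A sketch of what the standard does guarantee here (the function name is mine):

```c
#include <limits.h>

/* Unsigned arithmetic is defined to wrap modulo 2^N on every
   conforming implementation, so 0U - 1 is always UINT_MAX; only the
   numeric value of UINT_MAX (the width of unsigned int) varies
   between platforms. */
unsigned minus_one(void) { return 0U - 1; }
```

So the quoted remark is about the value of UINT_MAX itself being implementation-defined, not about the wrap-around behaviour, which is portable.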
Also the standard defines only three possibilities of representing signed integers: 2's complement (I would opt for this as the only valid representation, standard-defined), 1's complement and sign-magnitude.
So at least we can rely on msbit :)
As for the rest, I don't really have a problem with signed integer overflow being undefined (or is it implementation-defined?). However, I wouldn't mind well-defined wrapping either, though this would naturally require adopting 2's complement, killing two birds with one stone.
User avatar
lucasart
Posts: 3232
Joined: Mon May 31, 2010 1:29 pm
Full name: lucasart

Re: A note for C programmers

Post by lucasart »

mar wrote: Also the standard defines only three possibilities of representing signed integers: 2's complement (I would opt for this as the only valid representation, standard-defined), 1's complement and sign-magnitude.
So at least we can rely on msbit :)
As for the rest, I don't really have a problem with signed integer overflow being undefined (or is it implementation-defined?). However, I wouldn't mind well-defined wrapping either, though this would naturally require adopting 2's complement, killing two birds with one stone.
Signed integer overflow being undefined, and the similar problem with pointer/integer arithmetic, is a can of worms. The links by Wylie Garvin are an eye opener about this problem (among others).

If only the C standard mandated 2's complement, the whole can could be closed in one go. Suddenly signed integer arithmetic would overflow in a defined way, and these dangerous compiler optimizations would become illegal.

Anyway, these optimizations are useless and toxic. If the programmer writes:

Code: Select all

if (a+b > a+c)
instead of

Code: Select all

if (b > c)
there must be a good reason! We do not want our "friend" the clever compiler to simplify and neglect the overflow.

I compiled DiscoCheck with and without the '-fwrapv' GCC option, and the resulting code is just as fast (and functionally identical). I just don't see how these optimizations taking advantage of undefined signed overflow can be useful in practice (assuming well-written code).
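To illustrate why the two forms really do diverge once overflow wraps, here's a sketch using unsigned arithmetic (which the standard defines to wrap) to stand in for -fwrapv semantics; the function name is mine:

```c
#include <limits.h>

/* Returns 1 when (a+b > a+c) and (b > c) agree under wrapping
   arithmetic, 0 when they disagree. With a near UINT_MAX, a+b can
   wrap around while a+c does not, flipping the first comparison. */
int forms_agree(unsigned a, unsigned b, unsigned c) {
    return (a + b > a + c) == (b > c);
}
```

For example, with a = UINT_MAX - 1, b = 2, c = 1: a+b wraps to 0 while a+c is UINT_MAX, so the first comparison is false while b > c is true. That divergence is exactly what the optimizer assumes can never happen for signed ints.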
Theory and practice sometimes clash. And when that happens, theory loses. Every single time.
mar
Posts: 2554
Joined: Fri Nov 26, 2010 2:00 pm
Location: Czech Republic
Full name: Martin Sedlak

Re: A note for C programmers

Post by mar »

lucasart wrote:Signed integer overflow being undefined, and the similar problem with pointer/integer arithmetic, is a can of worms. The links by Wylie Garvin are an eye opener about this problem (among others).

If only the C standard mandated 2's complement, the whole can could be closed in one go. Suddenly signed integer arithmetic would overflow in a defined way, and these dangerous compiler optimizations would become illegal.

Anyway, these optimizations are useless and toxic. If the programmer writes:

Code: Select all

if (a+b > a+c)
instead of

Code: Select all

if (b > c)
there must be a good reason! We do not want our "friend" the clever compiler to simplify and neglect the overflow.

I compiled DiscoCheck with and without the '-fwrapv' GCC option, and the resulting code is just as fast (and functionally identical). I just don't see how these optimizations taking advantage of undefined signed overflow can be useful in practice (assuming well-written code).
I couldn't agree more. I too think that this kind of optimization is absolutely useless.
User avatar
lucasart
Posts: 3232
Joined: Mon May 31, 2010 1:29 pm
Full name: lucasart

Re: A note for C programmers

Post by lucasart »

According to the first paper, the C standard permits even crazier optimizations. Apparently (I was not aware of it and still cannot believe it myself), these two expressions are not equivalent:

Code: Select all

a[x ? 1 : 0]
a[x != 0]
The second one is fine, and no standard-compliant compiler will break this code. But the first one is dangerous. The compiler is allowed to assume that x can only be 1 or 0, and hence replace it with a[x]. This is a "nice" optimization because it produces branchless code. The problem is that when x is not 0 or 1, all hell breaks loose. The program could segfault (if you're lucky!) or break silently...
Theory and practice sometimes clash. And when that happens, theory loses. Every single time.