Reputation: 4810
I'm seeing some weird behavior from the optimizer that I wasn't expecting: a variable overflows, and the logic I'm using to handle that case breaks.
Here is the complete program (weird.cpp):
#include <stdio.h>

class Example {
public:
    void Change() {
        change_count_++;
        // Check for overflow; if it happened, reset to 1, which is still valid.
        if (change_count_ <= 0) {
            change_count_ = 1;
        }
    }

public:
    int change_count_ = 0;
};

int main() {
    Example example;
    printf("Pre: %d\n", example.change_count_);
    example.change_count_ = 2147483647; // INT_MAX
    printf("Mid: %d\n", example.change_count_);
    example.Change();
    printf("Pst: %d\n", example.change_count_);
    return 0;
}
When building using these commands:
gcc -fPIC -g3 -O1 -g -std=gnu++11 weird.cpp -o optimized
gcc weird.cpp -o normal
The program normal executes as expected, with the following output:
Pre: 0
Mid: 2147483647
Pst: 1
But optimized gives the following unexpected output:
Pre: 0
Mid: 2147483647
Pst: -2147483648
Attaching a debugger shows me that the increment is done last in the function. Is overflow behavior undefined in C++? Or is there a different way I should be handling this?
Here is the version of clang I'm using:
tiny.local:~/scratch/weird_inc$ gcc -v
Configured with: --prefix=/Library/Developer/CommandLineTools/usr --with-gxx-include-dir=/usr/include/c++/4.2.1
Apple LLVM version 10.0.0 (clang-1000.10.44.4)
Target: x86_64-apple-darwin17.7.0
Thread model: posix
InstalledDir: /Library/Developer/CommandLineTools/usr/bin
I tested with gcc 5.4.0 on Ubuntu and it gave me the expected output.
Upvotes: 3
Views: 266
Reputation: 118352
Signed integer overflow is undefined behavior in C++, and has been in every revision of the standard.
A compiler is free to assume that undefined behavior never happens, and it is not required to generate meaningful code for paths that can only be reached through undefined behavior.
This means the compiler can safely assume that incrementing a value greater than 0 never produces a result that is negative or 0.
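You can see that assumption in isolation with a tiny function (a hypothetical helper, shown only for illustration; with optimizations enabled, gcc and clang typically reduce it to return false;):

bool wraps(int i) {
    // i + 1 < i can only be true if i + 1 overflows, which is
    // undefined behavior, so the optimizer assumes it is always false.
    return i + 1 < i;
}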
Your compiler is aggressively optimizing the code. It sees the variable being set to a positive value and then incremented, concludes that the result cannot be negative or 0, and so the comparison never even gets compiled: it can never be true.
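One portable way to handle this is to test before incrementing, so the overflow never actually happens. Here is a sketch of Change() rewritten that way, using INT_MAX from <climits> (one option, not the only one):

#include <climits>

class Example {
public:
    void Change() {
        // Test *before* incrementing: the overflowing addition is never
        // executed, so there is no undefined behavior to optimize around.
        if (change_count_ == INT_MAX) {
            change_count_ = 1;
        } else {
            change_count_++;
        }
    }

public:
    int change_count_ = 0;
};

Alternatively, make change_count_ unsigned (unsigned wraparound is well-defined), or build with -fsanitize=undefined, which both gcc and clang support, so the runtime reports signed overflow when it actually occurs.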
Upvotes: 4