Reputation: 170509
Recently I ran the following code on ideone.com (gcc-4.3.4)
#include <stddef.h>
#include <stdio.h>
#include <stdlib.h>
#include <new>
using namespace std;
void* operator new( size_t size ) throw(std::bad_alloc)
{
    void* ptr = malloc( 2 * 1024 * 1024 * 1024 );
    printf( "%p\n", ptr );
    return ptr;
}
void operator delete( void* ptr )
{
    free( ptr );
}
int main()
{
    char* ptr = new char;
    if( ptr == 0 ) {
        printf( "unreachable\n" );
    }
    delete ptr;
}
and got this output:
(nil)
unreachable
although new should never return a null pointer, so the caller can count on that, and the compiler could have eliminated the ptr == 0 check and treated the dependent code as unreachable.
Why would the compiler not eliminate that code? Is it just a missed optimization or is there some other reason for that?
Upvotes: 9
Views: 1821
Reputation: 11
Clang does the optimization you expected:
cccc@~/workspace/tmp$ clang++ --version
Apple clang version 13.1.6 (clang-1316.0.21.2.3)
Target: x86_64-apple-darwin21.4.0
Thread model: posix
InstalledDir: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin
cccc@~/workspace/tmp$ clang++ test.cc -std=c++11
test.cc:10:40: warning: overflow in expression; result is -2147483648 with type 'int' [-Winteger-overflow]
void *ptr = malloc(2 * 1024 * 1024 * 1024);
^
test.cc:15:6: warning: function previously declared with an explicit exception specification redeclared with an implicit exception specification [-Wimplicit-exception-spec-mismatch]
void operator delete(void *ptr) { free(ptr); }
^
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk/usr/include/c++/v1/new:182:36: note: previous declaration is here
_LIBCPP_OVERRIDABLE_FUNC_VIS void operator delete(void* __p) _NOEXCEPT;
^
2 warnings generated.
cccc@~/workspace/tmp$ ./a.out
0x0
unreachable
cccc@~/workspace/tmp$ clang++ test.cc -std=c++11 -O3
test.cc:10:40: warning: overflow in expression; result is -2147483648 with type 'int' [-Winteger-overflow]
void *ptr = malloc(2 * 1024 * 1024 * 1024);
^
test.cc:15:6: warning: function previously declared with an explicit exception specification redeclared with an implicit exception specification [-Wimplicit-exception-spec-mismatch]
void operator delete(void *ptr) { free(ptr); }
^
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk/usr/include/c++/v1/new:182:36: note: previous declaration is here
_LIBCPP_OVERRIDABLE_FUNC_VIS void operator delete(void* __p) _NOEXCEPT;
^
2 warnings generated.
cccc@~/workspace/tmp$ ./a.out
cccc@~/workspace/tmp$
Upvotes: 1
Reputation: 15656
I checked the assembly produced with g++ -O3 -S: with the standard operator new, gcc (4.4.5) does not remove the if( ptr == 0 ) check.
It seems that gcc does not have compiler flags or function attributes to optimize away NULL checks either.
So it appears that gcc does not support this kind of optimization at this time.
Upvotes: 0
Reputation: 179991
C++11 is clear on the issue:
void* operator new(std::size_t size);
Required behavior: Return a non-null pointer to suitably aligned storage (3.7.4), or else throw a bad_alloc exception. This requirement is binding on a replacement version of this function.
You hit Undefined Behavior.
[edit] Now, why would this impede optimization? Compiler vendors tend to spend their time dreaming up optimizations for code patterns that are commonly used. There's usually little benefit for them to optimize for faster Undefined Behavior. (Some UB may be well-defined on that particular compiler and still be optimized, but the above example likely wouldn't be).
Upvotes: 5
Reputation: 477378
I think this is very simple, and you have two fundamentally different things confused:

- malloc() can return anything, in particular a null pointer.
- The global C++ allocation function void * operator new(size_t) throw(std::bad_alloc) is required by the standard either to return a pointer to the required amount of suitably aligned storage, or to exit through an exception.
If you want to replace the global allocation function, it is your responsibility to provide a replacement that abides by the rules of the standard. The simplest version looks like this:
void * operator new(size_t n) throw(std::bad_alloc) {
    void * const p = std::malloc(n);
    if (p == NULL) throw std::bad_alloc();
    return p;
}
Any serious implementation should actually contain a loop to call the registered new-handler until the allocation succeeds, and only throw once there are no more new-handlers.
The program that you wrote simply has undefined behaviour.
Digression: Why is this new defined that way? Consider the standard allocation sequence when you say T * p = ::new T(). It is equivalent to this:
void * addr = ::operator new(sizeof(T)); // allocation
T * p = ::new (addr) T(); // construction
If the second line throws (i.e. construction fails), the memory is deallocated with the corresponding deallocation function. If the first call fails, though, then the execution must never reach the second line! The only way to achieve this is by exiting through an exception. (The no-throw versions of the allocation functions are only for manual use where the user code can inspect the result of the allocator before proceeding to construction.)
Upvotes: 7
Reputation: 1
There is more than one operator new; see here. And you did not declare yours as possibly throwing an exception, so the compiler should not infer that it never returns a null pointer.
I don't know the latest C++11 standard very well, but I guess that only the standard-defined operator new (the one throwing an exception) is supposed to return a non-null pointer, not any user-defined ones.
And in the current GCC trunk, the file libstdc++-v3/libsupc++/new does not seem to contain any specific attribute telling GCC that nil is never returned, even though I believe it is undefined behavior to get nil from a throwing new.
Upvotes: 2
Reputation: 42825
I think you're expecting way too much of the optimizer. By the time the optimizer gets to this code, it considers new char to be just another function call whose return value is stored on the stack. So it doesn't see the if condition as deserving special treatment.
This is probably triggered by the fact that you overrode operator new, and it's beyond the optimizer's pay grade to look in there, see that you called malloc, which can return NULL, and decide that this overridden version won't return NULL. malloc looks like Just Another Function Call. Who knows? You might be linking in your own version of that, too.
There are a couple of other examples of overridden operators changing their behavior in C++: operator &&, operator ||, and operator ,. Each of these has a special behavior when not overridden, but behaves like a plain function call when overridden. For example, the built-in operator && will not even compute its right-hand side at all if the left-hand side evaluates as false. However, if overridden, both sides of the operator && are computed before they are passed to it; the short-circuit feature goes away completely. (This is done to support using operator overloading to define mini-languages in C++; for one example of this, see the Boost Spirit library.)
Upvotes: 4
Reputation: 300059
Why should the compiler do so?
With an opaque implementation of new, it's impossible to know whether the implementation is correct or not. Yours is non-standard, so you are lucky that the check was kept after all.
Upvotes: 3
Reputation: 64253
although new should never return a null pointer
It shouldn't in normal operation. But what about a crazy situation where someone pulls out the memory, or it simply dies, or it just gets full?
Why would the compiler not eliminate that code? Is it just a missed optimization or is there some other reason for that?
Because new can fail. If you use the no-throw version, it can return NULL (or nullptr in C++11).
Upvotes: 0