Reputation: 13898
I recently discovered that a lot of places in our code were doing something like this:
int * int_array = new int[1000];
// ... do things with int_array
delete int_array;
The problem, of course, is that it should be using the delete[] operator, not the regular delete operator.
The mystery is: This code has been working for literally years, when compiled by Visual Studio 2003 and 2005 on Windows, and GCC / clang on OS X. Why hasn't this caused things to go terribly wrong before now?
As I understand it, we're telling the compiler to deallocate memory the "wrong" way, and usually if you do that, something terrible happens and your program crashes. Why isn't this happening for us? Do modern compilers automatically "do the right thing" for you, or enough of the right thing that it doesn't matter for basic types, or something else? I can't accept that we simply got lucky, as this code has been in use for years, by thousands of customers under multiple different operating systems.
Note that I'm NOT asking for an excuse to do things wrong, just trying to understand why we aren't in trouble for doing things wrong. :)
Upvotes: 1
Views: 180
Reputation: 16670
Let's run through what happens, step by step (but we'll ignore exceptions):
int *foo = new int[100];
This allocates 100*sizeof(int) bytes of memory and calls int::int() 100 times, once for each element in the array. Since int is a builtin type, it has a trivial constructor (i.e., it does nothing).
Now, how about:
delete foo;
This will call int::~int() on the address pointed to by foo, and then delete the memory pointed to by foo. Again, since int is a built-in type, it has a trivial destructor.
Compare this to:
delete [] foo;
which will call int::~int() for each of the items in the array pointed to by foo, and then delete the memory pointed to by foo.
The basic difference here is that elements 1..99 don't get destructed. For int that's not a problem, and that's probably why your existing code works. For arrays of objects that have a real destructor, you'll see a lot more misbehavior.
P.S. Most implementations will delete the whole block of memory pointed to by foo if you write delete foo; even if you allocated it with array-new - but you would be foolish to count on that.
Upvotes: 3
Reputation: 182753
This is the nature of undefined behavior -- it might do exactly what you intended it to do. The problem is, with the next version of the compiler, operating system, library, or CPU ... it might do something completely different.
Most likely, you're getting away with it for two reasons:
1. int doesn't have a destructor. So the failure to correctly destroy each element in the array has no consequences.
2. On this platform, new and new[] use the same allocator. So you're not returning a block to the wrong allocator.
Upvotes: 5
Reputation: 185671
The regular delete operator may actually free all the memory from the array, depending on the allocator, but the real problem is that it won't ever run any destructors on the array elements.
Upvotes: 1
Reputation: 66
That will not usually cause the program to crash, but formally it's undefined behavior: depending on the implementation, you may leak memory each time you do it (though in practice most allocators free the whole block anyway).
Upvotes: 1
Reputation: 970
This does not work reliably. Deleting arrays without [] is undefined behavior and can leak memory, which will affect performance if your application runs for a long time. For short-lived programs there may be no visible issue.
Another thing to note is that delete releases memory that was allocated by new. If you allocated a single object with new (not an array with new[]), you can use delete to destroy it.
Upvotes: 1