Reputation: 855
If I have a class like this:
template <class T>
class Array {
private:
    T *m_pData;
    unsigned int m_nSize;
public:
    Array(unsigned int nSize) : m_pData(nullptr), m_nSize(nSize) {
        if (m_nSize > 0)
            m_pData = new T[m_nSize];
    }
    ~Array() {
        if (m_pData != NULL) {
            delete[] m_pData;
        }
    }
};
If now I create an object of my class like this:
Array<int> miArray(5);
The implementation of the destructor should be fine, but if I create an object like this:
Array<int*> miArray(5);
then in the destructor I should delete every object stored in the array to avoid memory leaks.
How can I achieve that?
Thanks
Upvotes: 0
Views: 781
Reputation: 63124
You could use a specialization of your destructor, or tag-dispatch to a deletion function (which would be more flexible). However, this leads to a pretty awkward and brittle design. Supposing you have implemented an equivalent to push_back, consider these user code snippets:
{
Array<int*> arr;
arr.push_back(new int(42)); // new is on the side of the user
} // ... but delete is not. Weird.
This also leads to a whole class of errors: you should only give new-ed Ts to an Array<T*>, but there is no security whatsoever against passing in something else:
{
Array<int*> arr;
int i = 42;
arr.push_back(&i); // No diagnostic
arr.push_back(new int[17]); // No diagnostic either
} // Undefined behaviour from calling `delete` on stuff that hasn't been `new`ed
All of this is the reason for the existence of the No raw owning pointers rule: raw pointers should not manage resource lifetime, at all. If I use an Array<int*>, it should store my pointers and let me use them, but never ever delete them, because that's trying to manage lifetime.
Instead, if I want an Array that manages lifetime, I'll use the corresponding smart pointer, with Array<std::unique_ptr<int>>. This ties in nicely with Array, requiring it to only call the destructor of the contained objects (which it already does). These destructors (~unique_ptr, that is) will transitively deallocate the resources as is their job, and all is well.
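To make the mechanism concrete, here is a minimal sketch (not the question's Array itself) that imitates what Array<std::unique_ptr<int>>'s destructor does: delete[] on the buffer runs ~unique_ptr for every element, and each of those frees its owned int.

```cpp
#include <memory>

// Sketch: a raw buffer of unique_ptrs standing in for the buffer
// inside a hypothetical Array<std::unique_ptr<int>>.
int sum_and_release() {
    auto buf = new std::unique_ptr<int>[3]{}; // value-init: all null
    buf[0] = std::make_unique<int>(40);
    buf[1] = std::make_unique<int>(2);
    int sum = 0;
    for (int i = 0; i < 3; ++i)
        if (buf[i]) sum += *buf[i];
    delete[] buf; // each ~unique_ptr deletes its owned int -- no leaks
    return sum;
}
```

No manual cleanup loop is needed anywhere: ownership is encoded in the element type, not in the container.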
Take care to initialize your Array's buffer if you need to -- m_pData = new T[m_nSize]; will allocate default-initialized objects. For fundamental types (and other types without a user-provided default constructor), their values will be indeterminate. A simple fix is to use new T[m_nSize]{}, which performs value-initialization -- that is, it initializes arithmetic types to zero, pointers to nullptr, and compound types recursively.
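A small sketch of the difference, using arrays of fundamental and pointer types:

```cpp
// new T[n] default-initializes (indeterminate for int, int*);
// new T[n]{} value-initializes (0 for int, nullptr for int*).
int first_zeroed_int(unsigned n) {
    int* b = new int[n]{};   // every element is 0
    int result = b[0];
    delete[] b;
    return result;
}

int* first_zeroed_pointer(unsigned n) {
    int** p = new int*[n]{}; // every element is nullptr
    int* result = p[0];
    delete[] p;
    return result;
}
```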
Implement copy and/or move semantics for your class, see Rule of three/five/zero. As it is now, copying an instance of Array will lead to two instances thinking they own the same buffer, and undefined behaviour when both try to delete it.
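A hedged sketch of what the missing copy operations could look like, using a simplified Buffer of int in place of the question's Array<T> (copy-and-swap for the assignment operator is one common choice, not the only one):

```cpp
#include <algorithm>
#include <cstddef>

// Rule-of-three sketch: deep-copying buffer so that no two
// instances ever think they own the same allocation.
class Buffer {
    int* m_pData;
    std::size_t m_nSize;
public:
    explicit Buffer(std::size_t n)
        : m_pData(n ? new int[n]{} : nullptr), m_nSize(n) {}

    Buffer(const Buffer& other)                 // deep copy
        : m_pData(other.m_nSize ? new int[other.m_nSize] : nullptr),
          m_nSize(other.m_nSize) {
        std::copy(other.m_pData, other.m_pData + m_nSize, m_pData);
    }

    Buffer& operator=(Buffer other) {           // copy-and-swap
        std::swap(m_pData, other.m_pData);
        std::swap(m_nSize, other.m_nSize);
        return *this;
    }

    ~Buffer() { delete[] m_pData; }

    int& operator[](std::size_t i) { return m_pData[i]; }
    std::size_t size() const { return m_nSize; }
};
```

With these in place, copying produces an independent buffer, so each destructor deletes its own allocation exactly once.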
delete and delete[] check for null, so the if(m_pData != NULL) in the destructor is redundant.
Good luck :)
Upvotes: 4
Reputation: 42924
I think a good design would be to just wrap raw owning pointers (e.g. int* in your example) in smart pointer classes, like std::unique_ptr or std::shared_ptr, or some other RAII wrapper.

This is basically what happens with std::vector, where you can have vector<unique_ptr<T>> or vector<shared_ptr<T>> if you want to store owning pointers to T in the vector.

Note that storing raw observing pointers in your container is fine (as long as the lifetime of the pointed-to objects is properly taken care of).
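The std::vector analogy can be sketched in a few lines: the vector owns the unique_ptrs, and the unique_ptrs own the ints, so everything is freed automatically when the vector goes out of scope.

```cpp
#include <memory>
#include <vector>

// Ownership chain: vector -> unique_ptr -> int.
// No explicit delete anywhere; destruction cascades automatically.
int total() {
    std::vector<std::unique_ptr<int>> v;
    v.push_back(std::make_unique<int>(1));
    v.push_back(std::make_unique<int>(2));
    int sum = 0;
    for (const auto& p : v)
        sum += *p;
    return sum; // vector, unique_ptrs, and ints all destroyed here
}
```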
Just a couple of additional notes:
Your class should either ban copying by =delete-ing the copy constructor and copy assignment operator, or properly implement these copy operations. In its current state, it is bug-prone if people try to copy-construct or copy-assign instances of your class.

The Array(unsigned int nSize) constructor should be marked explicit to avoid implicit conversions from unsigned int.
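A short sketch of the explicit point, using a hypothetical Arr standing in for the question's Array:

```cpp
// explicit stops an unsigned int from silently converting to an Arr.
struct Arr {
    explicit Arr(unsigned n) : size(n) {}
    unsigned size;
};

void consume(const Arr&) {}

// consume(5);      // would not compile: explicit blocks the conversion
// consume(Arr(5)); // fine: the conversion is intentional and visible
```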
Upvotes: 2