Reputation: 469
I'm trying to implement my own version of a vector class without using iterators. Here are the parts that may be relevant to the question.
template <typename T>
class Vector {
public:
    ...
    ~Vector()
    {
        delete [] m_data;
    }
    ...
    void erase(size_t position)
    {
        if (position >= m_size) {
            throw std::out_of_range("erasing an element out of bounds");
        }
        --m_size;
        for (size_t i = position; i < m_size; ++i) {
            m_data[i] = m_data[i + 1];
        }
        m_data[m_size].T::~T();
    }
    ...
private:
    T* m_data;
    size_t m_size;
    ...
};
Below is a quote from cplusplus.com for the erase function of std::vector:
This effectively reduces the vector size by the number of elements removed, calling each element's destructor before.
So I tried to implement the same functionality by calling the destructor of the last, now-duplicate element. Calling the destructor of m_data[position] is unnecessary, as that slot is simply overwritten by the next element. The problem is that delete [] m_data in the vector's destructor will also call the destructor for each element, so the element I already destroyed in erase gets destroyed a second time, which crashes.
Can anyone help me write a correct erase function for my vector class?
Upvotes: 1
Views: 4552
Reputation: 264331
What you want to do is make the underlying array an array of char. Then you can use placement new to construct elements into your vector's raw storage, and when an element is removed, explicitly call its destructor.
void push_back(T const& el)
{
    makeSureThereIsSpaceForOneMore();
    new (E(m_size)) T(el);   // placement new: construct a copy in the raw slot
    ++m_size;
}
void erase(size_t index)
{
    MoveStuffAround();       // shift elements [index + 1, m_size) down one slot
    E(m_size - 1)->~T();     // destroy the stale duplicate left at the end
    --m_size;
}
private:
    char* m_data;            // raw bytes, so delete [] m_data runs no T destructors
    size_t m_size;
    ...
    // treat slot `index` of the raw buffer as a T
    T* E(size_t index) { return reinterpret_cast<T*>(&m_data[index * sizeof(T)]); }
};
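With this layout the double destruction from the question disappears, because delete [] m_data on a char array runs no element destructors; you run them yourself. A minimal sketch of the matching Vector destructor, under the same assumptions as the code above:

~Vector()
{
    // destroy each live element explicitly, back to front
    for (size_t i = m_size; i > 0; --i) {
        E(i - 1)->~T();
    }
    delete [] m_data;   // frees the raw bytes only; no T destructors run here
}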
Upvotes: 1
Reputation: 469
I came up with a solution: don't use the new/delete operators, but use C-style malloc/free instead. This is because the new and delete operators call constructors and destructors implicitly, and you have no control over them. If we use malloc/free instead, we can explicitly call constructors/destructors on the objects exactly when we need to. So in this example we can explicitly call the destructor for the last object in the erase function, and also for each element in the destructor of the vector class.
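A minimal sketch of what this approach could look like (my reading of the answer; member names follow the question, and capacity management is elided):

#include <cstddef>     // size_t
#include <cstdlib>     // std::malloc, std::free
#include <new>         // placement new
#include <stdexcept>   // std::out_of_range

template <typename T>
class Vector {
public:
    explicit Vector(size_t capacity)
        : m_data(static_cast<T*>(std::malloc(capacity * sizeof(T))))
        , m_size(0)
    {
    }

    ~Vector()
    {
        for (size_t i = 0; i < m_size; ++i) {
            m_data[i].~T();    // explicit destructor call per live element
        }
        std::free(m_data);     // releases the raw memory; runs no destructors
    }

    void push_back(T const& el)
    {
        // assumes there is still capacity left (growth logic elided)
        new (&m_data[m_size]) T(el);   // placement new constructs in place
        ++m_size;
    }

    void erase(size_t position)
    {
        if (position >= m_size) {
            throw std::out_of_range("erasing an element out of bounds");
        }
        --m_size;
        for (size_t i = position; i < m_size; ++i) {
            m_data[i] = m_data[i + 1];
        }
        m_data[m_size].~T();   // destroy only the stale last element
    }

private:
    T* m_data;
    size_t m_size;
};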
Upvotes: 0
Reputation: 101
How about replacing m_data[m_size].T::~T(); with m_data[m_size] = T();?
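This sidesteps the double destruction because nothing is destroyed early: assigning a default-constructed T over the stale last element leaves it alive, so delete [] m_data later destroys each element exactly once. It does, however, require T to be default-constructible and assignable, and the dead slot keeps a live T around until the vector itself is destroyed.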
Upvotes: 0
Reputation: 299
@Artak don't forget to do your bounds checking the other direction (in erase)
Upvotes: 0
Reputation: 153792
What you are trying doesn't fly that easily! If you want to implement something like std::vector<T> you need to do the full monty: you need to deal with raw memory and construct/destroy objects explicitly. That is, you need to allocate a sufficient chunk of uninitialized memory, construct/destroy objects at their appropriate locations as needed, and eventually release the allocated memory. It is an interesting exercise to do for a toy version of std::vector<T>, and afterwards you'll gladly use the version shipping with your compiler, because it somehow manages to be faster, actually implements all of the functionality, and is reasonably bug-free. Of course, if you happen to implement a version of the standard C++ library you'll need to suffer through the entire exercise. The good news is: std::vector<T> is trivial compared to std::deque<T>, which I'd be prepared to bet big bucks you won't get anywhere nearly as efficient as a standard library version without the use of algorithms (and to get it really efficient you'd need fairly complex versions of the algorithms as well; I'm not sure there are many implementations which actually do specialized versions that are good on std::deque<T>).
Not using iterators for this is, BTW, just not helpful: algorithms like std::move() (the version taking iterators as arguments) or std::copy() (if you don't use C++2011) avoid littering your code with duplicated loops. Having the code in algorithms has the added advantage that the not entirely trivial logic is nicely encapsulated in one place. Putting the repeatedly needed code into algorithms makes the implementation of the containers comparatively simple, giving the implementation a much better chance of being correct... not to mention that it makes it actually viable to implement interesting optimizations as well.
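To illustrate the point about algorithms: raw pointers are perfectly good iterators, so erase shrinks to a couple of lines once std::move() does the shifting. A sketch, assuming the m_data/m_size members from the question over storage whose element lifetimes are managed manually as described above:

#include <algorithm>   // std::move (the three-iterator overload, C++11)

void erase(size_t position)
{
    // move-assign the tail one slot to the left in a single call
    std::move(m_data + position + 1, m_data + m_size, m_data + position);
    --m_size;
    m_data[m_size].~T();   // destroy the moved-from last element
}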
Upvotes: 2