Reputation: 490
I'd like to serialize a QVector into a char* array. I do this with the following code:
QVector<int> in;
...
QByteArray bytes;
QDataStream stream(&bytes, QIODevice::WriteOnly);
stream << in;
std::copy(bytes.constData(), bytes.constData() + bytes.size(), out);
I guarantee that out is large enough. Because this code is called extremely often, I would like to avoid the unnecessary std::copy and make either QByteArray or QDataStream work directly on the preallocated user memory pointed to by out. Is that possible? Any bright ideas?
UPDATE: QByteArray::fromRawData() does not meet my needs because it does not allow the char* buffer it was created on to be modified; in other words, QByteArray performs a deep copy on the first modification of an instance created this way. As the documentation says: "This ensures that the raw data array itself will never be modified by QByteArray."
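For example, a minimal sketch illustrating that deep copy:
char raw[4] = { 'a', 'b', 'c', 'd' };
QByteArray ba = QByteArray::fromRawData(raw, sizeof(raw));
ba[0] = 'X'; // first modification detaches: QByteArray makes a deep copy
// raw[0] is still 'a' -- the preallocated buffer is never written to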
SOLUTION: The solution proposed by @skyhisi matches my needs perfectly. The complete code is the following.
SimpleBuffer.hpp
#pragma once
#include <QtCore/QIODevice>
#include <cstring>

// Write-only QIODevice that streams directly into a caller-supplied buffer.
class SimpleBuffer : public QIODevice {
    Q_OBJECT
    Q_DISABLE_COPY(SimpleBuffer)
public:
    SimpleBuffer(char* const begin, const char* const end) :
        _begin(begin),
        _end(end) {}

    virtual bool atEnd() const {
        return _end == _begin;
    }

    virtual bool isSequential() const {
        return true;
    }

protected:
    // Reading is not supported; this device is write-only.
    virtual qint64 readData(char*, qint64) {
        return -1;
    }

    // Copy as much as fits into the remaining space and advance the write pointer.
    virtual qint64 writeData(const char* const data, const qint64 maxSize) {
        const qint64 space = _end - _begin;
        const qint64 toWrite = qMin(maxSize, space);
        memcpy(_begin, data, size_t(toWrite));
        _begin += toWrite;
        return toWrite;
    }

private:
    char* _begin;
    const char* const _end;
};
main.cpp
#include "SimpleBuffer.hpp"
#include <QtCore/QVector>
#include <QtCore/QDataStream>
#include <QtCore/QByteArray>
int main(int, char**) {
    QVector<int> src;
    src << 3 << 7 << 13 << 42 << 100500;

    const size_t dataSize = sizeof(quint32) + src.size() * sizeof(int);
    char* const data = new char[dataSize];

    // prepare stream and write out the src vector
    {
        SimpleBuffer simpleBuffer(data, data + dataSize);
        simpleBuffer.open(QIODevice::WriteOnly);
        QDataStream os(&simpleBuffer);
        os << src;
    }

    // read the vector back with QByteArray
    QVector<int> dst;
    {
        const QByteArray byteArray = QByteArray::fromRawData(data, dataSize);
        QDataStream is(byteArray);
        is >> dst;
    }
    delete [] data;

    // check we've read exactly what we wrote
    Q_ASSERT(src == dst);
    return 0;
}
Upvotes: 0
Views: 1500
Reputation: 714
Why not use QBuffer?
QByteArray myBuffer;
myBuffer.reserve(10000); // no re-allocation
QBuffer buffer(&myBuffer);
buffer.open(QIODevice::WriteOnly);
QDataStream out(&buffer);
out << QApplication::palette();
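Applied to the question's QVector<int>, a rough sketch (the reserve size here is just a guessed upper bound):
QVector<int> in;
in << 3 << 7 << 13 << 42 << 100500;
QByteArray myBuffer;
myBuffer.reserve(64); // comfortably holds the count prefix plus the ints, so no re-allocation
QBuffer buffer(&myBuffer);
buffer.open(QIODevice::WriteOnly);
QDataStream out(&buffer);
out << in; // myBuffer now holds the serialized vector
Note that the bytes still end up in a QByteArray rather than directly in a caller-supplied char*.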
Upvotes: 0
Reputation: 8147
I think you may need to implement a QIODevice; you could make a very simple sequential device quite easily. Here's one I've quickly thrown together. I haven't checked that it works (feel free to get it working and edit the post).
class SimpleBuffer : public QIODevice
{
    Q_OBJECT
public:
    SimpleBuffer(char* begin, char* end) : mBegin(begin), mEnd(end) {}
    virtual bool atEnd() const { return mEnd == mBegin; }
    virtual bool isSequential() const { return true; }

protected:
    virtual qint64 readData(char*, qint64) { return -1; }
    virtual qint64 writeData(const char* data, qint64 maxSize)
    {
        const qint64 space = mEnd - mBegin;
        const qint64 toWrite = qMin(maxSize, space);
        memcpy(mBegin, data, size_t(toWrite));
        mBegin += toWrite;
        return toWrite;
    }

private:
    char* mBegin;
    char* mEnd;
    Q_DISABLE_COPY(SimpleBuffer)
};
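A minimal usage sketch (buf, bufSize and values are placeholder names for a preallocated buffer and the vector to serialize):
SimpleBuffer device(buf, buf + bufSize);
device.open(QIODevice::WriteOnly); // a QIODevice must be opened before streaming
QDataStream out(&device);
out << values; // writes straight into buf, with no intermediate QByteArray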
Upvotes: 2
Reputation: 134
Maybe fromRawData works:
QByteArray QByteArray::fromRawData ( const char * data, int size ) [static]
Using it, something like this:
char* out = new char[enoughbytes]; // preallocate at a suitable scope
QVector<int> in;
QByteArray ba = QByteArray::fromRawData(out, enoughbytes);
QDataStream stream(&ba, QIODevice::WriteOnly);
stream << in;
Note that QDataStream adds some of its own data at the start (not much though), so remember to preallocate a bit more for that, as well as for whatever additional data QVector serializes.
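For a QVector<int> under the default QDataStream format, a rough sizing sketch (mirroring the dataSize computation in the question's solution):
const int enoughbytes = sizeof(quint32) + in.size() * sizeof(qint32); // element-count prefix + the elements themselves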
Upvotes: 0