Reputation: 7
I've got a type BYTE which is defined in the windows.h header as follows:
typedef unsigned char BYTE;
Then I've got some fields in a database which I retrieve as a QString and convert to a QByteArray.
Currently I have:
QString str = "0x168de78"; //some info from the database
QByteArray tempB = QByteArray::fromHex(str.toAscii().toHex());
BYTE *sequence = (BYTE*)strdup(tempB.data());
Here, "0x168de78" is a sample of the information I expect to get from the database.
I need to convert the QString literally to a BYTE sequence so I can use it later.
If I inspect the values of sequence and tempB.data(), I get something similar to this:
0x867dc8 "0x168de78"
0x867de0 "0x168de78"
0x867df8 "0x168de78"
So tempB.data() is alright, but sequence is not. What am I doing wrong?
Upvotes: 0
Views: 768
Reputation: 1569
QString str = "0x168de78"; // some info from the database
// Decode the hex digits in the string into raw bytes:
QByteArray tempB = QByteArray::fromHex(str.toAscii());
// or, if you want the hex encoding of the characters themselves:
// QByteArray tempB = str.toAscii().toHex();
BYTE *sequence = (BYTE*)strdup(tempB.data());
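One caveat: strdup copies only up to the first 0x00 byte, and bytes decoded by fromHex can easily contain embedded zeros, so the sequence may get truncated. A minimal sketch of a length-aware copy instead, keeping the size alongside the pointer:

#include <cstdlib>  // malloc, free
#include <cstring>  // memcpy

// Copy all decoded bytes, including any embedded 0x00,
// using the byte count that QByteArray already knows.
int len = tempB.size();
BYTE *sequence = (BYTE*)malloc(len);
memcpy(sequence, tempB.constData(), len);
// ... use sequence together with len; free(sequence) when done.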
Update: if you can use the STL:

#include <sstream>

unsigned int value;
std::stringstream ss;
ss << std::hex << "0x168de78";
ss >> value; // value now holds 0x168de78
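For reference, a self-contained version of that snippet; the printed number is just the decimal form of the sample constant:

#include <iostream>
#include <sstream>

int main() {
    unsigned int value = 0;
    std::stringstream ss;
    ss << std::hex << "0x168de78"; // std::hex makes the later extraction parse hex
    ss >> value;
    std::cout << value << std::endl; // prints 23649912, i.e. 0x168de78
    return 0;
}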
Upvotes: 1