ZeroReps

Reputation: 15

Sending a fixed size UDP packet

I am writing a C++ program to send a UDP packet of 10 bytes. The packet consists of 3 parts:

1 byte - a single number from 0-9

1 byte - a single number from 0-2

8 bytes - a number from 0 to the maximum value of uint64_t

Let's say, for example, the numbers are 2, 0 and 14829735431805717965. What I am doing at the moment is sending those numbers as variables to an ostringstream, which I read from this thread should convert the values to ASCII for serialization.

ostringstream str1;
uint8_t byte1 = 2;
uint8_t byte2 = 0;
uint64_t bytes = 14829735431805717965;


str1 << byte1 << byte2 << bytes;
string result = str1.str();
const char *pchar = result.c_str();

The above code demonstrates my process to make it compatible with the sendto() method. However, when I do sizeof(str1) I get 152, even though separately those variables add up to 10 bytes (1+1+8). When I do sizeof(result) and sizeof(pchar) I get 28 and 4 respectively - nothing remotely resembling my 10 bytes. I can only figure my method of doing this is wrong somehow - is each digit of the uint64_t value converted to a separate ASCII character, which adds up to 150 or something? I have looked around for similar problems, but to no avail.

Upvotes: 0

Views: 1043

Answers (1)

Remy Lebeau

Reputation: 595367

However when I do sizeof(str1) I get 152

That is because you are obtaining the byte size of the ostringstream object itself. ostringstream has many data members in it besides just the pointer to the character data. You are looking for the byte size of the actual character data. ostringstream doesn't provide that size directly.

even though separately those variables add up to 10 bytes (1+1+8).

The raw variables do, yes. But by using operator<<, you are formatting their values as text: uint8_t is typically an alias for unsigned char, so the two small values are inserted as single characters, while the uint64_t value is written out one character per decimal digit. The total byte size of the final character data will therefore be more like 22 (1 + 1 + 20), not 10.
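
A minimal sketch of that difference (the exact sizeof value is implementation-specific; the 22 is not):

#include <cstdint>
#include <iostream>
#include <sstream>

int main() {
    uint8_t byte1 = 2;
    uint8_t byte2 = 0;
    uint64_t bytes = 14829735431805717965ull;

    std::ostringstream str1;
    str1 << byte1 << byte2 << bytes;

    std::cout << sizeof(str1) << '\n';       // size of the ostringstream object itself (e.g. 152)
    std::cout << str1.str().size() << '\n';  // size of the formatted character data: 22
}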

When I do sizeof(result) and sizeof(pchar) I get 28 and 4 respectively

Again, that is because you are obtaining the byte size of the string object itself (which also has other data members besides just the pointer to the character data) and the byte size of a raw pointer, respectively. You are looking for the byte size of the character data that the string internally points to.

result.c_str() will return a pointer to the actual character data, and result.size() will give you the proper byte size for that data.
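
A small sketch of the same point for string and pointer (the sizeof values are implementation-specific, e.g. 28 or 32 for the string object and 4 or 8 for a pointer; only size() reflects the data):

#include <iostream>
#include <string>

int main() {
    std::string result = "some serialized data";
    const char *pchar = result.c_str();

    std::cout << sizeof(result) << '\n';  // size of the std::string object itself
    std::cout << sizeof(pchar) << '\n';   // size of a raw pointer
    std::cout << result.size() << '\n';   // length of the character data: 20
}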

I can only figure my method of doing this is wrong somehow

Using operator<< is completely the wrong approach in this situation. To put binary data into an ostream, use its write() method instead, e.g.:

uint8_t byte1 = 2;
uint8_t byte2 = 0;
uint64_t bytes = 14829735431805717965ull;

ostringstream str1;
str1.write(reinterpret_cast<char*>(&byte1), sizeof(byte1));
str1.write(reinterpret_cast<char*>(&byte2), sizeof(byte2));
str1.write(reinterpret_cast<char*>(&bytes), sizeof(bytes));

string result = str1.str();
sendto(..., result.c_str(), result.size(), ...);
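
Note that the string built with write() holds the raw bytes, including the 0x00 written for byte2, which is why the length comes from result.size() rather than something like strlen(). A quick sanity check (a sketch on top of the snippet above):

#include <cassert>

// after the three write() calls above:
assert(result.size() == 10);   // 1 + 1 + 8 raw bytes
assert(result[1] == '\0');     // byte2 == 0 is stored as an embedded zero byte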

Personally, I would suggest using a packed (byte-aligned) struct instead:

#pragma pack(push, 1)
struct packet {
    uint8_t byte1;
    uint8_t byte2;
    uint64_t bytes;
};
#pragma pack(pop)

packet p;
p.byte1 = 2;
p.byte2 = 0;
p.bytes = 14829735431805717965ull;

sendto(..., reinterpret_cast<char*>(&p), sizeof(p), ...);
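
If you want the compiler to verify that the pragma really removed all padding (without it, sizeof(packet) would typically be 16 because of the alignment of the uint64_t member), a compile-time check does the trick:

static_assert(sizeof(packet) == 10, "packet must be exactly 10 bytes with no padding");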

Or, worst case, just use a simple byte array:

char packet[10];

uint8_t byte1 = 2;
uint8_t byte2 = 0;
uint64_t bytes = 14829735431805717965ull;

memcpy(&packet[0], &byte1, sizeof(byte1));
memcpy(&packet[1], &byte2, sizeof(byte2));
memcpy(&packet[2], &bytes, sizeof(bytes));
/* or (memcpy is the safer choice, since &packet[2] is not guaranteed to be suitably aligned for a uint64_t store):
*reinterpret_cast<uint8_t*>(&packet[0]) = byte1;
*reinterpret_cast<uint8_t*>(&packet[1]) = byte2;
*reinterpret_cast<uint64_t*>(&packet[2]) = bytes;
*/

sendto(..., packet, sizeof(packet), ...);
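
For completeness, here is a minimal self-contained sketch of the byte-array approach wired up to an actual sendto() call, assuming a POSIX sockets environment; the destination address (127.0.0.1) and port (5000) are placeholder assumptions:

#include <cstdint>
#include <cstring>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>

int main() {
    // Build the 10-byte payload exactly as above.
    char packet[10];
    uint8_t byte1 = 2;
    uint8_t byte2 = 0;
    uint64_t bytes = 14829735431805717965ull;
    std::memcpy(&packet[0], &byte1, sizeof(byte1));
    std::memcpy(&packet[1], &byte2, sizeof(byte2));
    std::memcpy(&packet[2], &bytes, sizeof(bytes));

    // Placeholder destination: 127.0.0.1:5000.
    sockaddr_in dest{};
    dest.sin_family = AF_INET;
    dest.sin_port = htons(5000);
    inet_pton(AF_INET, "127.0.0.1", &dest.sin_addr);

    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0)
        return 1;

    // Send the fixed-size 10-byte datagram.
    sendto(sock, packet, sizeof(packet), 0,
           reinterpret_cast<const sockaddr*>(&dest), sizeof(dest));

    close(sock);
    return 0;
}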

Upvotes: 1
