Reputation: 574
I am currently trying to get a C++ server running that can communicate with a WebSocket client. The handshake consists of a couple of steps, and I have no success with the last one.
The first step is to generate a SHA-1 hash, and I successfully obtained the right hex string (examples: http://en.wikipedia.org/wiki/WebSocket and https://www.rfc-editor.org/rfc/rfc6455).
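For context, RFC 6455 fixes the SHA-1 input as the client's Sec-WebSocket-Key concatenated with a magic GUID. A minimal sketch of that step, using the example key from the RFC (clientKey would normally be parsed from the request header):

#include <string>

// Magic GUID fixed by RFC 6455.
const std::string websocketGuid = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11";
// Example key from RFC 6455; in practice taken from the Sec-WebSocket-Key header.
std::string clientKey = "dGhlIHNhbXBsZSBub25jZQ==";
std::string sha1Input = clientKey + websocketGuid;  // this is the string that gets hashed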
My output is in both cases the same as stated in the documentation:
Wikipedia: 1d 29 ab 73 4b 0c 95 85 24 00 69 a6 e4 e3 e9 1b 61 da 19 69
My Server: 1d 29 ab 73 4b 0c 95 85 24 00 69 a6 e4 e3 e9 1b 61 da 19 69
IETF Docu: b3 7a 4f 2c c0 62 4f 16 90 f6 46 06 cf 38 59 45 b2 be c4 ea
My Server: b3 7a 4f 2c c0 62 4f 16 90 f6 46 06 cf 38 59 45 b2 be c4 ea
So this is right. When I now do the Base64 encoding, I come to the following results:
Wikipedia: HSmrc0sMlYUkAGmm5OPpG2HaGWk=
My Server: MWQyOWFiNzM0YjBjOTU4NTI0MDA2OWE2ZTRlM2U5MWI2MWRhMTk2OQ==
IETF Docu: s3pPLMBiTxaQ9kYGzzhZRbK+xOo=
My Server: YjM3YTRmMmNjMDYyNGYxNjkwZjY0NjA2Y2YzODU5NDViMmJlYzRlYQ==
And these are completely different. I confirmed that my Base64 algorithm works with certain online converters, and they all produced the same output my server did. So the problem must be the input format. I found an entry in a JavaScript forum where someone had the same problem, and the answer was that, instead of passing the 40-character hex string, one should pass the 20-byte binary representation.
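For illustration, a minimal sketch of decoding such a 40-character hex digest back into its 20 raw bytes (toBytes is a hypothetical helper; it assumes a valid, even-length hex input):

#include <cstddef>
#include <string>
#include <vector>

// Decode a hex string ("b37a4f2c...") into raw bytes (0xb3, 0x7a, ...).
std::vector<unsigned char> toBytes(const std::string& hexDigest) {
    std::vector<unsigned char> bytes;
    for (std::size_t i = 0; i + 1 < hexDigest.size(); i += 2) {
        // parse two hex digits into one byte value
        bytes.push_back(static_cast<unsigned char>(
            std::stoul(hexDigest.substr(i, 2), nullptr, 16)));
    }
    return bytes;  // feed these 20 bytes to the Base64 encoder
}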
I know that OpenSSL's SHA1 returns a binary representation, but I can't use that library for certain reasons. The SHA1 library I use puts the digest in an int array. The output looks like this (IETF example):
result[0] = 3011137324
result[1] = 3227668246
result[2] = 2432058886
result[3] = 3476576581
result[4] = 2998846698
I then convert this to hex like this:
#include <iomanip>
#include <sstream>

std::ostringstream oss;
oss << std::setfill('0');
for (int i = 0; i < 5; ++i) {
    // print each 32-bit word as 8 zero-padded hex digits
    oss << std::setw(8) << std::hex << result[i];
}
Now the big question: how can I convert my hex string to binary?
Thanks a lot in advance. Markus
EDIT
If someone is interested in the code: https://github.com/MarkusPfundstein/C---Websocket-Server
Upvotes: 2
Views: 2797
Reputation: 443
I was testing WebSockets in C and found that the bytes were in the wrong order. Writing them out in reverse order solved my problem with the Base64 encoding, resulting in the correct accepted key string:
unsigned char byteResult[20];
/* write each 32-bit word most-significant byte first (big-endian) */
for (int i = 0; i < 5; i++) {
    byteResult[(i * 4) + 3] = sha.result[i] & 0x000000ff;
    byteResult[(i * 4) + 2] = (sha.result[i] & 0x0000ff00) >> 8;
    byteResult[(i * 4) + 1] = (sha.result[i] & 0x00ff0000) >> 16;
    byteResult[(i * 4) + 0] = (sha.result[i] & 0xff000000) >> 24;
}
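Base64-encoding these 20 bytes (rather than their hex representation) then yields the expected s3pPLMBiTxaQ9kYGzzhZRbK+xOo= for the IETF example.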
Upvotes: 2
Reputation: 294377
On a slightly related note (I see you already discovered the EVP BIO base64 way...):
result[0] = 3011137324
...
oss << std::setw(8) << std::hex << result[i];
If I understand correctly, this results in the output b37a4f2c, which is your IETF Docu example. Be very careful here, because you are treading in the dangerous open waters of platform-specific endianness. 3011137324 is indeed 0xb37a4f2c, but that int's bytes sit in memory as 2c 4f 7a b3 on little-endian machines, like Intel architectures. You would probably be better off to reinterpret_cast &result[0] to unsigned char* and then handle that as an array of bytes, instead of as an array of (unsigned) ints.
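A small sketch to make that visible, using the first result word from the question (the printed order depends on the host):

#include <cstdio>

int main() {
    unsigned int word = 3011137324u;  // == 0xb37a4f2c
    const unsigned char* bytes = reinterpret_cast<const unsigned char*>(&word);
    // Prints "2c 4f 7a b3" on little-endian hosts, "b3 7a 4f 2c" on big-endian ones.
    for (int i = 0; i < 4; ++i)
        std::printf("%02x ", bytes[i]);
    std::printf("\n");
    return 0;
}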
Upvotes: 1
Reputation: 3580
Most Base64 encoders expect a byte array/stream of binary data. You want to split your ints up into bytes, using bit masks and logical shifts. On 32-bit systems each int contains 4 bytes; you can extract them as follows:
for (int i = 0; i < 5; i++) {
    byteResult[(i * 4) + 0] = result[i] & 0x000000ff;
    byteResult[(i * 4) + 1] = (result[i] & 0x0000ff00) >> 8;
    byteResult[(i * 4) + 2] = (result[i] & 0x00ff0000) >> 16;
    byteResult[(i * 4) + 3] = (result[i] & 0xff000000) >> 24;
}
Here byteResult is a byte array 4 times larger than the result array. I'm assuming the order in which the bytes have been packed into the ints; it may be the other way round.
Pass this byte array into your Base64 encoder.
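(Note that this mask-and-shift extraction works on the arithmetic value of each int rather than its memory layout, so unlike a reinterpret_cast it behaves identically on little- and big-endian hosts; only the byte order chosen within each word has to match what the SHA1 library produced.)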
Upvotes: 1