Reputation: 564
I am building a client application which needs to send packets to the server in a particular format: each byte is expanded into two bytes, so if the byte is 0xA3, the server expects {0x3A, 0x33}.
I had used the approach below earlier. It works well if the byte is, for instance, 0x89, but if the byte is 0xA3 it doesn't work:
// Pad the hex string to two digits, then encode each digit as digit + 0x30.
string hex = hexStr.Length == 1 ? "0" + hexStr : hexStr;
byte packet1 = (byte)(int.Parse(hex[0].ToString(), System.Globalization.NumberStyles.HexNumber) + 0x30);
byte packet2 = (byte)(int.Parse(hex[1].ToString(), System.Globalization.NumberStyles.HexNumber) + 0x30);
Examples of expected output:
- input => 0x89 , Output => {0x38, 0x39}
- input => 0xA3 , Output => {0x3A, 0x33}
However, if I use the above code I get the following output:
- input => 0x89 , Output => {0x38, 0x39}
- input => 0xA3 , Output => {0x41, 0x33}
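In other words, each output byte should be a nibble of the input byte with 0x30 added; a minimal sketch of the mapping I'm after (no strings involved):
byte input = 0xA3;
byte expected1 = (byte)((input >> 4) + 0x30);   // high nibble: 0x0A + 0x30 == 0x3A
byte expected2 = (byte)((input & 0x0F) + 0x30); // low nibble:  0x03 + 0x30 == 0x33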
Upvotes: 0
Views: 253
Reputation: 63732
The problem isn't in the code you've shown.
You need to do all your math in hexadecimal, and format the results (if needed) as hexadecimal as well:
string hex = "A3";
byte packet1 = (byte)(int.Parse(hex[0].ToString(), NumberStyles.HexNumber) + 0x30);
byte packet2 = (byte)(int.Parse(hex[1].ToString(), NumberStyles.HexNumber) + 0x30);
Console.WriteLine("{0:X2}, {1:X2}", packet1, packet2); // 3A, 33
works exactly as you expect.
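For completeness, here is the same snippet as a self-contained program that checks both of your sample bytes (the Demo class name is just for the sketch):
using System;
using System.Globalization;

class Demo
{
    static void Main()
    {
        foreach (byte input in new byte[] { 0x89, 0xA3 })
        {
            string hex = input.ToString("X2"); // e.g. "A3"
            byte packet1 = (byte)(int.Parse(hex[0].ToString(), NumberStyles.HexNumber) + 0x30);
            byte packet2 = (byte)(int.Parse(hex[1].ToString(), NumberStyles.HexNumber) + 0x30);
            Console.WriteLine("0x{0:X2} => {{0x{1:X2}, 0x{2:X2}}}", input, packet1, packet2);
        }
        // Prints:
        // 0x89 => {0x38, 0x39}
        // 0xA3 => {0x3A, 0x33}
    }
}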
The results you got seem to indicate that your real code takes the character code of each hex digit directly (for example via a cast), rather than parsing the digit and adding 0x30: (byte)'A' is 0x41, while (byte)'8', (byte)'9' and (byte)'3' are 0x38, 0x39 and 0x33, which is why 0x89 appeared to work. This does not happen in the code you've posted, so just fix your actual code and you'll be fine.
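If your actual code looks anything like this sketch (a guess on my part, not code you've posted), it would reproduce your output exactly:
string hex = "A3";
// Bug: casting the char yields its character code, which equals
// digit + 0x30 only for the digits '0'..'9', not for 'A'..'F'.
byte packet1 = (byte)hex[0]; // 'A' => 0x41, but 0x3A is wanted
byte packet2 = (byte)hex[1]; // '3' => 0x33, coincidentally correct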
Upvotes: 1