Reputation: 138
I'm working with an embedded system that returns ASCII data which includes (what I believe to be) a modular sum checksum. I would like to verify this checksum, but I've been unable to do so based on the manufacturer's specification. I've also been unable to do the reverse: calculate the same checksum from the description.
Each response from the device is in the following format:
╔═════╦═══════════════╦════════════╦════╦══════════╦═════╗
║ SOH ║ Function Code ║ Data Field ║ && ║ Checksum ║ ETX ║
╚═════╩═══════════════╩════════════╩════╩══════════╩═════╝
Example:
SOHi11A0014092414220&&FBEA
Where SOH is the ASCII control character 1 (0x01), e.g.
#define SOH "\x01"
The description of the checksum is as follows:
The Checksum is a series of four ASCII-hexadecimal characters which provide a check on the integrity of all the characters preceding it, including the control characters. The four characters represent a 16-bit binary count which is the 2's complemented sum of the 8-bit binary representation of the message characters after the parity bit (if enabled) has been cleared. Overflows are ignored. The data integrity check can be done by converting the four checksum characters to the 16-bit binary number and adding the 8-bit binary representation of the message characters to it. The binary result should be zero.
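For reference, here's my literal reading of the generation side of that description (a sketch only; it assumes parity is disabled, so clearing the parity bit is just a mask with 0x7F, and ComputeChecksum is just an illustrative name):
// Illustrative helper: my reading of the spec's generation rule
static ushort ComputeChecksum(string message)
{
    // Sum the 8-bit values of every character before the checksum,
    // SOH and ampersands included, with the parity bit cleared
    var sum = 0;
    foreach (var c in message)
        sum += c & 0x7F;
    // Two's complement of the low 16 bits, so that
    // (sum + checksum) mod 2^16 == 0; overflow is ignored
    return unchecked((ushort)(-sum));
}
For the example above this yields 0xFC1A rather than the device's FBEA, which is exactly the discrepancy I'm trying to explain.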
I've tried a few different interpretations of the specification, including ignoring SOH as well as the ampersands, and even the function code. At this point I must be missing something very obvious in either my interpretation of the spec or the code I've been using to test. Below you'll find a simple example (the data was taken from a live system); if it were correct, the lower word of the validate variable would be 0:
static void Main(string[] args)
{
    unchecked
    {
        var data = String.Format("{0}{1}", (char)1, @"i11A0014092414220&&");
        const string checkSum = "FBEA";
        // Checksum is a 16-bit word
        var checkSumValue = Convert.ToUInt16(checkSum, 16);
        // Sum of the message chars before the first '&' (this variant skips the ampersands)
        var mySum = data.TakeWhile(c => c != '&').Aggregate(0, (current, c) => current + c);
        var validate = checkSumValue + mySum;
        Console.WriteLine("Data: {0}", data);
        Console.WriteLine("Checksum: {0:X4}", checkSumValue);
        Console.WriteLine("Sum of chars: {0:X4}", mySum);
        Console.WriteLine("Validation: {0}", Convert.ToString(validate, 2));
        Console.ReadKey();
    }
}
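As written, this prints Sum of chars: 039A and Validation: 1111111110000100 (0xFF84), so the lower word is clearly not zero; including the SOH and the ampersands in the sum gives 0xFFD0 instead, which is no better.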
While the solution provided by @tinstaafl works for this particular example, it doesn't work for a larger record such as the one below:
SOHi20100140924165011000007460904004608B40045361000427DDD6300000000427C3C66000000002200000745B4100045B3D8004508C00042754B900000000042774D8D0000000033000007453240004531E000459F5000420EA4E100000000427B14BB000000005500000744E0200044DF4000454AE000421318A0000000004288A998000000006600000744E8C00044E7200045469000421753E600000000428B4DA50000000&&
BA6C
Theoretically you could keep incrementing/decrementing a value in the string until the checksum matched. It just so happened that using the character '1' rather than the ASCII SOH control character gave it just the right value; in this case it's a coincidence, as the arithmetic below shows.
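To put numbers on that: summing every character of the real message, SOH and ampersands included, gives 0x03E6, and 0x03E6 + 0xFBEA = 0xFFD0, which is 0x30 short of wrapping to zero. Substituting the character '1' (0x31) for SOH (0x01) adds exactly 0x30 to the sum, which is why the short example happens to validate.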
Upvotes: 3
Views: 1727
Reputation: 6948
Not sure if this is exactly what you're looking for, but by using the literal character '1' in place of the SOH control character (char 1), taking the sum of all the characters, and truncating the validate variable to a 16-bit integer, I was able to get validate to equal 0:
var data = (@"1i11A0014092414220&&");
const string checkSum = "FBEA";
// Checksum is a 16-bit word
var checkSumValue = Convert.ToUInt16(checkSum, 16);
// Sum of all the message chars preceding the checksum (ampersands included)
var mySum = data.Sum<char>(c => c);
var validate = (UInt16)(checkSumValue + mySum);
Console.WriteLine("Data: {0}", data);
Console.WriteLine("Checksum: {0:X4}", checkSumValue);
Console.WriteLine("Sum of chars: {0:X4}", mySum);
Console.WriteLine("Validation: {0}", Convert.ToString(validate, 2));
Console.ReadKey();
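The (UInt16) cast is what implements the spec's "overflows are ignored": only the low 16 bits of checkSumValue + mySum are kept, and for this record they come out to zero.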
Upvotes: 1