Verifying a modular sum checksum in C#

I am working with an embedded system that returns ASCII data which includes (what I believe to be) a modular sum checksum. I would like to verify this checksum, but I have been unable to do so based on the manufacturer's spec. I was also unable to do the reverse and reproduce the same checksum from the description.

Each response from the device has the following format:

╔═════╦═══════════════╦════════════╦════╦══════════╦═════╗
║ SOH ║ Function Code ║ Data Field ║ && ║ Checksum ║ ETX ║
╚═════╩═══════════════╩════════════╩════╩══════════╩═════╝


Example:

SOHi11A0014092414220&&FBEA


Where SOH is the ASCII control character 1; for example:

#define SOH "\x01"
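
For what it's worth, this is how I split a raw frame into fields; the fixed widths (a four-character checksum at the end, everything else in front of it) are my own reading of the table above, and SplitFrame is just an illustrative name:

// Sketch based on my reading of the format table; assumes the reader has
// already consumed the frame and `frame` holds everything from SOH up to
// (but not including) ETX.
static void SplitFrame(string frame, out string body, out string checksum)
{
    // The last four characters are the ASCII-hex checksum; everything
    // before them (SOH, function code, data field, "&&") is the body.
    checksum = frame.Substring(frame.Length - 4);
    body = frame.Substring(0, frame.Length - 4);
}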


The checksum description looks like this:

The checksum is a series of four ASCII-hexadecimal characters that verify the integrity of all preceding characters, including the control characters. The four characters represent a 16-bit binary count which is the 2's complement of the sum of the 8-bit binary representations of the message characters after the parity bit (if enabled) has been cleared. Overflow is ignored. Data integrity can be checked by converting the four checksum characters to a 16-bit binary number and adding to it the sum of the 8-bit binary representations of the message characters. The binary result must be zero.
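
My reading of that description, as code (a sketch of my interpretation only, not the manufacturer's algorithm; ComputeChecksum is a name I made up):

// Interpretation: sum the 8-bit values of every character preceding the
// checksum (with the parity bit cleared), take the 16-bit two's complement,
// and render the result as four ASCII-hexadecimal characters.
static string ComputeChecksum(string message)
{
    int sum = 0;
    foreach (char c in message)
        sum += c & 0x7F;  // clear the parity bit

    // 16-bit two's complement; the cast discards any overflow
    var checksum = unchecked((ushort)(-sum));
    return checksum.ToString("X4");
}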

I've tried several different interpretations of the spec, including ignoring the SOH, the ampersands, and even the function code. At this point I must be missing something obvious, either in my interpretation of the spec or in the code I used for testing. Below is a simple example (the data was taken from a live system); if my interpretation were correct, the low 16 bits of the variable validate would be 0:

using System;
using System.Linq;

static void Main(string[] args)
{
    unchecked
    {
        var data = String.Format("{0}{1}", (char) 1, @"i11A0014092414220&&");
        const string checkSum = "FBEA";

        // Checksum is a 16-bit word
        var checkSumValue = Convert.ToUInt16(checkSum, 16);

        // Sum of message chars preceding the checksum (TakeWhile stops at the
        // first '&', so the "&&" separator is excluded from this sum)
        var mySum = data.TakeWhile(c => c != '&').Aggregate(0, (current, c) => current + c);
        var validate = checkSumValue + mySum;

        Console.WriteLine("Data: {0}", data);
        Console.WriteLine("Checksum: {0:X4}", checkSumValue);
        Console.WriteLine("Sum of chars: {0:X4}", mySum);
        Console.WriteLine("Validation: {0}", Convert.ToString(validate, 2));
        Console.ReadKey();
    }
}


Edit

While the solution provided by @tinstaafl works for this particular example, it does not work for a larger message such as:

SOHi20100140924165011000007460904004608B40045361000427DDD6300000000427C3C66000000002200000745B4100045B3D8004508C00042754B900000000042774D8D0000000033000007453240004531E000459F5000420EA4E100000000427B14BB000000005500000744E0200044DF4000454AE000421318A0000000004288A998000000006600000744E8C00044E7200045469000421753E600000000428B4DA50000000 &&

BA6C


In theory, you could keep increasing or decreasing a character value in the string until the checksum matched; it just so happens that using the literal character '1' rather than the ASCII SOH control character produced exactly the right value in this case.
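
The coincidence is visible in the numbers: with the real SOH the total lands 0x30 short of wrapping to zero, and '1' (0x31) is exactly 0x30 above SOH (0x01):

// Requires using System.Linq
var withSoh = (char)1 + "i11A0014092414220&&";
var withOne = "1" + "i11A0014092414220&&";

// 0xFBEA + 0x03E6 = 0xFFD0 -> 0x30 short of 0x10000
Console.WriteLine("{0:X4}", withSoh.Sum(c => (int)c));  // 03E6
// 0xFBEA + 0x0416 = 0x10000 -> truncates to 0000
Console.WriteLine("{0:X4}", withOne.Sum(c => (int)c));  // 0416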





1 answer


Not sure if this is exactly what you are looking for, but by using the literal character '1' for SOH instead of char value 1, taking the sum of all the characters, and converting the validate variable to a 16-bit integer, I was able to get a validation of 0:



var data = (@"1i11A0014092414220&&");
const string checkSum = "FBEA";

// Checksum is 16 bit word
var checkSumValue = Convert.ToUInt16(checkSum, 16);

// Sum of message chars preceeding checksum
var mySum = data.Sum<char>(c => c);
var validate = (UInt16)( checkSumValue + mySum);

Console.WriteLine("Data: {0}", data);
Console.WriteLine("Checksum: {0:X4}", checkSumValue);
Console.WriteLine("Sum of chars: {0:X4}", mySum);

Console.WriteLine("Validation: {0}", Convert.ToString(validate, 2));
Console.ReadKey();
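
Note that the (UInt16) cast is doing real work here: truncating the total to 16 bits is what implements the spec's "overflow is ignored" rule. A quick illustration with the values from the example:

// 0xFBEA + 0x0416 = 0x10000; truncating to 16 bits yields 0,
// which is the zero result the spec asks for.
int total = 0xFBEA + 0x0416;
Console.WriteLine((ushort)total);  // 0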










