Reputation: 1151
I'm trying to compute the 16-bit CRC-CCITT (FALSE). I'm using this page to check it:
http://www.sunshine2k.de/coding/javascript/crc/crc_js.html
And this is my code:
unsigned int16 crc16_CCITT(unsigned int8 *data, unsigned int16 len) // CRC16 CCITT False
{
    unsigned int16 crc = 0xFFFF;
    for(unsigned int16 j = len; j > 0; j--)
    {
        crc ^= *data++;
        for(unsigned int8 i = 0; i < 8; i++)
        {
            if(crc & 1)
            {
                //crc = (crc >> 1) ^ 0x8401; // 0x8401 is the reflection of 0x1021
                crc = (crc >> 1) ^ 0x1021;
            }
            else
            {
                crc >>= 1;
            }
        }
    }
    return (crc);
}
As you can see, I already tried reflecting the polynomial, and that didn't work either.
I don't understand what I'm doing wrong; I have already used this routine with the 16-bit ARC CRC (0x8005) and it works fine.
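(For comparison: ARC is a reflected CRC, processed LSB first with the reversed polynomial 0xA001 and init 0x0000, which is why a right-shift loop fits it, while CCITT-FALSE is unreflected and starts from 0xFFFF. Below is a minimal sketch of that reflected ARC variant, assuming the usual CRC-16/ARC parameters; the exact routine used for ARC may have differed.)

// CRC-16/ARC sketch: poly 0x8005 reversed to 0xA001, init 0x0000, data taken LSB first
unsigned int16 crc16_ARC(unsigned int8 *data, unsigned int16 len)
{
    unsigned int16 crc = 0x0000;
    for(unsigned int16 j = len; j > 0; j--)
    {
        crc ^= *data++;
        for(unsigned int8 i = 0; i < 8; i++)
        {
            if(crc & 1)
                crc = (crc >> 1) ^ 0xA001;
            else
                crc >>= 1;
        }
    }
    return (crc);
}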
Upvotes: 0
Views: 253
Reputation: 41170
Try shifting the bits the other way. CCITT-FALSE is an unreflected CRC, so each data byte is XORed into the high byte of the register and the register is shifted left:
#include <stdint.h>

uint16_t crc16_CCITT(unsigned char *ptr, int count)
{
    uint16_t crc = 0xffff;
    int i = 0;

    while (--count >= 0)
    {
        /* XOR the next data byte into the high byte of the register */
        crc = crc ^ ((uint16_t)*ptr++ << 8);
        for (i = 0; i < 8; ++i)
        {
            /* Shift left, applying the polynomial when the top bit falls out */
            if (crc & 0x8000)
            {
                crc = (crc << 1) ^ 0x1021;
            }
            else
            {
                crc = crc << 1;
            }
        }
    }
    return crc;
}
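To sanity-check it (a minimal driver of my own, not part of the original answer), the standard test string "123456789" should give 0x29B1, which is also what the sunshine2k page shows for CRC16_CCITT_FALSE:

#include <stdio.h>
#include <string.h>

int main(void)
{
    unsigned char msg[] = "123456789";
    /* Expected CRC-16/CCITT-FALSE check value: 0x29B1 */
    printf("0x%04X\n", (unsigned)crc16_CCITT(msg, (int)strlen((char *)msg)));
    return 0;
}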
Upvotes: 1
Reputation: 373
unsigned int16 and unsigned int8 are ambiguous. It's better to change them to uint16_t, uint8_t (from <stdint.h>) or unsigned short, unsigned char. In many header files, int16 is defined as signed short and int8 is defined as signed char.
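For example, with <stdint.h> the prototype from the question could be declared like this (just an illustrative sketch, keeping the original function name):

#include <stdint.h>

/* Fixed-width types make the intended sizes explicit and portable */
uint16_t crc16_CCITT(const uint8_t *data, uint16_t len);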
Upvotes: 0