EagerLearner

Reputation: 41

How to set read timeouts between fixed packets of bytes?

I am developing an application in Visual C++ 2008 which is supposed to read some data received from a hardware device through a serial interface. The data is sent in packets of 45 bytes every 50 milliseconds. I am using the functions provided by the WIN32 API to do all the necessary tasks like open and close the ports.

This is my reference.

In it, the author describes how to set timeouts while waiting for data:

COMMTIMEOUTS timeouts = { 0 };
timeouts.ReadIntervalTimeout         = 50; // in milliseconds
timeouts.ReadTotalTimeoutConstant    = 50; // in milliseconds
timeouts.ReadTotalTimeoutMultiplier  = 10; // in milliseconds
timeouts.WriteTotalTimeoutConstant   = 50; // in milliseconds
timeouts.WriteTotalTimeoutMultiplier = 10; // in milliseconds

The first member of the COMMTIMEOUTS structure, ReadIntervalTimeout, is the maximum wait time between two successive bytes. For me this should be 0, since I receive the 45 bytes in one go, but it then has to be 50 milliseconds while waiting for the next packet.
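A sketch of how the timeouts might be set for this case (the values are illustrative assumptions, not tested against the actual device; the idea is that bytes within a packet arrive nearly back-to-back, while packets are 50 ms apart):

```cpp
// Sketch only: assumes the inter-byte time within a packet is well under
// 10 ms, so a gap of more than 10 ms marks the boundary between packets.
COMMTIMEOUTS timeouts = { 0 };
// End the read early when a gap appears between bytes, so a 45-byte
// ReadFile tends to return exactly one packet.
timeouts.ReadIntervalTimeout        = 10;  // in milliseconds
// Upper bound so a read cannot block forever if the device goes quiet.
timeouts.ReadTotalTimeoutConstant   = 100; // in milliseconds
timeouts.ReadTotalTimeoutMultiplier = 0;
if (!SetCommTimeouts(m_hComm, &timeouts))
    ; // handle error, e.g. via GetLastError()
```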

While setting up the DCB structure, will setting the byte size to 8*45 bits and then setting the ReadIntervalTimeout to 50 milliseconds work?

EDIT: It won't. I just read that the byte size is defined to be between 4 and 8. Is there any other way I could handle this?

Also, can I set event handlers with these timeouts? I want the application to keep running and read only when it detects data. But I am afraid that if, on startup, the hardware device has already sent, say, 20 of the 45 bytes, the application might treat byte 21 as the first byte. Is this fear unfounded?
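The usual remedy for starting mid-packet is to scan the received bytes for the packet's header and discard everything before it. A minimal sketch in portable C++, assuming for illustration a single header byte of 0x10 (the real framing comes from the device's protocol):

```cpp
#include <cstddef>
#include <cstdint>

// Scan a received buffer for an assumed header byte (0x10 here, purely
// illustrative) and return the index where a packet could start.
// Bytes before that index belong to a packet that was already in flight
// when the port was opened, and can be discarded.
std::size_t findPacketStart(const std::uint8_t* buf, std::size_t n)
{
    for (std::size_t i = 0; i < n; ++i)
        if (buf[i] == 0x10)  // assumed header byte
            return i;
    return n;  // no header found: discard everything
}
```

If the header byte can also occur inside the payload, a single byte is not enough on its own; that is where checking the footer as well (see the answer below on synchronization) comes in.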

EDIT 2: I have modified my read function as follows to check for the appropriate header:

void CGCUGUIDlg::fnRead()
{
    char TempChar;         // Temporary character used for reading
    char SerialBuffer[45]; // Buffer for storing received data
    DWORD NoBytesRead;
    int i = 0;
    do
    {
        ReadFile(m_hComm,          // Handle of the serial port
                 &TempChar,        // Temporary character
                 sizeof(TempChar), // Read one byte at a time
                 &NoBytesRead,     // Number of bytes actually read
                 NULL);

        // Note: the original test (TempChar == 0x10||0x80) is always true,
        // because 0x80 is a non-zero constant; each value must be
        // compared explicitly.
        if (TempChar == 0x10 || TempChar == (char)0x80)
        {
            if (i < (int)sizeof(SerialBuffer))
                SerialBuffer[i++] = TempChar; // Store TempChar in the buffer
        }
    }
    while (NoBytesRead > 0);
}

Upvotes: 0

Views: 563

Answers (1)

In your receiving app you need to implement your protocol. Your code starts out "unsynchronized": it looks for a header and a footer the correct number of bytes apart, and throws data away until both are found. Once they are found, the protocol is "synchronized" and the packet is passed on for consumption. The protocol keeps checking for the header/footer on every packet: if they are found, the data is accepted as a packet; if not, the state reverts to unsynchronized.

The code looking for a packet should accumulate data and, once synchronized, consume only the packet length from it. That is, if you receive 46 bytes, consume 45 and keep appending data to the remaining 1 byte until you again have 45 or more, and don't discard the excess. Once that works, you won't need to worry about the gap between packets.

The usual approach is to maintain a buffer to which received bytes are added and from which packets are consumed, leaving whatever wasn't consumed to be extended by the next read. This way your code doesn't have to worry about timeouts between packets, unexpected processing delays, and so on.
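The accumulate-and-consume buffer described above can be sketched in portable C++. The 0x10 header and 0x03 footer bytes below are illustrative assumptions; the real framing bytes come from the device's protocol:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

const std::size_t PACKET_LEN = 45;     // fixed packet size from the device
const std::uint8_t HEADER    = 0x10;   // assumed header byte (illustrative)
const std::uint8_t FOOTER    = 0x03;   // assumed footer byte (illustrative)

// Accumulates raw bytes from successive reads; extractPackets() consumes
// complete, correctly framed packets and leaves any trailing partial data
// in place for the next read.
struct PacketBuffer {
    std::vector<std::uint8_t> data;

    void feed(const std::uint8_t* bytes, std::size_t n) {
        data.insert(data.end(), bytes, bytes + n);
    }

    // Returns every complete packet found; drops bytes that cannot start
    // a valid packet (re-synchronization).
    std::vector<std::vector<std::uint8_t>> extractPackets() {
        std::vector<std::vector<std::uint8_t>> packets;
        std::size_t pos = 0;
        while (data.size() - pos >= PACKET_LEN) {
            if (data[pos] == HEADER && data[pos + PACKET_LEN - 1] == FOOTER) {
                // Header and footer the right distance apart: accept packet.
                packets.emplace_back(data.begin() + pos,
                                     data.begin() + pos + PACKET_LEN);
                pos += PACKET_LEN;
            } else {
                // Unsynchronized: drop one byte and try again.
                ++pos;
            }
        }
        // Keep the unconsumed tail for the next feed().
        data.erase(data.begin(), data.begin() + pos);
        return packets;
    }
};
```

Each ReadFile result would be passed to feed(), and extractPackets() called afterwards; any bytes left in the buffer are automatically prepended to the next read.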

Upvotes: 1
