Reputation: 42175
I'm using the SerialDataReceivedEventHandler
to know when my SerialPort
object receives data.
Is it possible to know when I won't be receiving any more data because the inter-byte timeout has occurred?
Upvotes: 1
Views: 1134
Reputation: 1396
The short answer is that you could, but you shouldn't.
There is no built-in way of doing that. However, you could set up a timer (as Hans mentioned) and reset it in your SerialDataReceivedEventHandler every time you receive characters. If the timer expires, you can assume that you will probably not be receiving any more data.
That being said, serial communication has its quirks. The timer expiring does not imply that the other system has finished transmitting. There could be a flow-control mechanism in between that makes one system wait for the other, or there could be a delay on either end (if your system is busy, it might buffer serial data before firing the event).
Usually, the end of a transmission is detected when a specific character is received (typically CR/LF), which is a lot more robust than waiting for a timeout.
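As a rough sketch of that delimiter-based approach, the DataReceived handler could accumulate bytes and only report a message once the terminator arrives. This assumes LF-terminated messages; the port name, baud rate, and class/member names below are only illustrative:

    using System;
    using System.IO.Ports;
    using System.Text;

    class DelimiterFramingDemo
    {
        // Placeholder port settings for illustration only.
        static readonly SerialPort Port = new SerialPort("COM1", 9600);
        static readonly StringBuilder Buffer = new StringBuilder();

        static void Main()
        {
            Port.DataReceived += OnDataReceived;
            Port.Open();
            Console.ReadLine();   // keep the demo process alive
        }

        static void OnDataReceived(object sender, SerialDataReceivedEventArgs e)
        {
            // Append whatever arrived; the event gives no guarantee about chunk size.
            Buffer.Append(Port.ReadExisting());

            // A message is complete only when the terminator shows up,
            // no matter how the bytes were split on the wire.
            int lf;
            while ((lf = Buffer.ToString().IndexOf('\n')) >= 0)
            {
                string message = Buffer.ToString(0, lf).TrimEnd('\r');
                Buffer.Remove(0, lf + 1);
                Console.WriteLine("Received: " + message);
            }
        }
    }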
Upvotes: 1
Reputation: 941347
No, a TimeoutException is only raised when you call Read() and there is no data to read. The DataReceived event is fired when there is data available, so the Read() call can never raise that exception.
A simple workaround is to use your own Timer. Call its Stop() and Start() methods in your event handler to reset the timer, and declare failure when its Tick event fires.
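A minimal sketch of that idea, using a System.Timers.Timer (rather than a UI timer with a Tick event) so it can be restarted directly from the DataReceived thread; the port name and the 200 ms idle threshold are placeholder values:

    using System;
    using System.IO.Ports;
    using System.Timers;

    class InterByteTimeoutDemo
    {
        // Placeholder settings: adjust the port and idle threshold to your device.
        static readonly SerialPort Port = new SerialPort("COM1", 9600);
        static readonly Timer IdleTimer = new Timer(200) { AutoReset = false };

        static void Main()
        {
            Port.DataReceived += OnDataReceived;
            IdleTimer.Elapsed += OnIdle;
            Port.Open();
            Console.ReadLine();   // keep the demo process alive
        }

        static void OnDataReceived(object sender, SerialDataReceivedEventArgs e)
        {
            // Restart the countdown on every chunk; as long as bytes keep
            // arriving, the Elapsed event never fires.
            IdleTimer.Stop();
            Console.Write(Port.ReadExisting());
            IdleTimer.Start();
        }

        static void OnIdle(object sender, ElapsedEventArgs e)
        {
            // No data for 200 ms: treat the transmission as finished (or failed).
            Console.WriteLine();
            Console.WriteLine("Idle timeout - no more data expected.");
        }
    }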
Upvotes: 1