Reputation: 2145
I am having a problem using the ManualResetEvent class with a timeout parameter. The problem occurs specifically on the Windows XP Embedded platform; the code works perfectly on other Windows platforms. I am communicating with a TCP server. In my client code, I connect to the server and spawn a new thread whose job is to continuously monitor the receive socket for data. I send data from the main thread. The code snippet is attached below:
internal void initSocket()
{
    .....
    .....
    if (socket.Connected)
    {
        Tracing.info("Connected to server");
        ReceiveThread = new Thread(new ThreadStart(StartReceiving));
        ReceiveThread.Start();
    }
}
/// <summary>
/// Sends a request to the server and waits for its response.
/// </summary>
/// <param name="msg">The request message to send.</param>
/// <param name="timeout">Timeout in milliseconds; if no response arrives in time, null is returned.</param>
/// <returns>The response message, or null on timeout.</returns>
private CdcMessage sendSync(CdcMessage msg, int timeout)
{
    resultMessage = null;
    // store current messageId...
    resultMessagePackageId = msg.MessageId;
    String msgToSend = msg.serialize();
    Tracing.debug("SEND : >> " + msgToSend);
    socketWriter.WriteLine(msgToSend);
    // Wait for response from read thread...
    resultReceivedEvent = new ManualResetEvent(false);
    bool bResponseSent = resultReceivedEvent.WaitOne(timeout);
    if (!bResponseSent)
    {
        resultMessage = null;
    }
    return resultMessage;
}
/// <summary>
/// Thread function which continuously checks for data
/// from the server. It reads the data only when it
/// is available.
/// </summary>
public void StartReceiving()
{
    while (Connected)
    {
        try
        {
            Thread.Sleep(100);
            String response = socketReader.ReadLine();
            Tracing.info("Raw data received = " + response);
            resultMessage = CdcMessage.deserialize(response);
            Tracing.info("Deserialized response = " + resultMessage);
            if (resultMessage == null)
            {
                continue;
            }
            else if (resultMessage.IsHeartbeat)
            {
                Tracing.debug("Heartbeat");
                socketWriter.WriteLine(response);
            }
            else if (!resultMessage.MessageId.Equals(resultMessagePackageId))
            {
                // not the correct package id...reject...
                Tracing.warn("REJECTED: Package-ID: " + resultMessage.MessageId);
                continue;
            }
            else
            {
                resultReceivedEvent.Set();
                Tracing.info("StartReceiving() : Received data");
                Tracing.debug("RECEIVED: >> " + response);
            }
        }
        catch (NullReferenceException nre)
        {
            Tracing.error("StartReceiving(): Socket doesn't exist!", nre);
            close();
            break;
        }
        catch (ObjectDisposedException ode)
        {
            Tracing.error("StartReceiving(): Socket is disposed!", ode);
            close();
            break;
        }
        catch (IOException ex)
        {
            Tracing.error("StartReceiving(): Socket IO-Exception!", ex);
            close();
            break;
        }
    }
}
I have highlighted the important aspects of the code. The WaitOne(timeout) call works without any problems on most Windows operating systems, but on XP Embedded I observe a problem: WaitOne returns almost immediately, with no data received from the receive thread.
As a workaround, I made the timeout infinite by passing -1 to WaitOne, which solved the problem. But this creates other side effects (e.g. if the server is shut down, WaitOne never returns!).
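For reference, the workaround is roughly this one-line change (in .NET, Timeout.Infinite is defined as -1):

bool bResponseSent = resultReceivedEvent.WaitOne(Timeout.Infinite); // blocks forever if the server never responds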
Can someone please help me solve this issue?
Upvotes: 0
Views: 216
Reputation: 17584
I'm not sure I understand your code correctly, but the lines
socketWriter.WriteLine(msgToSend);
resultReceivedEvent = new ManualResetEvent(false);
bool bResponseSent = resultReceivedEvent.WaitOne(timeout);
look strange to me. I think this would be better:
resultReceivedEvent.Reset();
socketWriter.WriteLine(msgToSend);
bool bResponseSent = resultReceivedEvent.WaitOne(timeout);
There may be a race condition if an old ManualResetEvent gets set before the new one is created. There doesn't seem to be a reason to create a new instance of ManualResetEvent here: just call Reset on the old instance, and make sure you reset it before sending the message.
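A minimal sketch of what I mean, reusing the field names from your snippet and assuming the event is created once when the connection is set up (untested on XP Embedded):

// Created once, together with the socket, and reused for every request.
private readonly ManualResetEvent resultReceivedEvent = new ManualResetEvent(false);

private CdcMessage sendSync(CdcMessage msg, int timeout)
{
    resultMessage = null;
    resultMessagePackageId = msg.MessageId;

    // Reset BEFORE writing, so the receive thread can never signal
    // a stale event instance between WriteLine and WaitOne.
    resultReceivedEvent.Reset();

    socketWriter.WriteLine(msg.serialize());

    // Blocks until the receive thread calls Set(), or the timeout elapses.
    bool bResponseSent = resultReceivedEvent.WaitOne(timeout);
    return bResponseSent ? resultMessage : null;
}

With a single long-lived event, the receive thread and sendSync always refer to the same object, so a Set() can never land on an instance that nobody is waiting on.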
Upvotes: 2