Reputation: 1006
Using .NET sockets, I am trying to determine whether the socket is connected or not, so I expect the send operation to fail if the remote endpoint is not connected. In practice I do not get an exception on the send: the send passes successfully, and only after a while does a send fail. Is there any explanation for this? What else can I do in order to get the connection state?
My code is very simple, based on this:

```csharp
String server = "127.0.0.1";  // placeholder: the remote host
String message = "Hello";     // placeholder: the message to send
Int32 port = 13000;
TcpClient client = new TcpClient(server, port);

// Translate the passed message into ASCII and store it as a Byte array.
Byte[] data = System.Text.Encoding.ASCII.GetBytes(message);

// Get a client stream for reading and writing.
NetworkStream stream = client.GetStream();

// Send the message to the connected TcpServer.
stream.Write(data, 0, data.Length);
Console.WriteLine("Sent: {0}", message);

// Receive the TcpServer response.
// Buffer to store the response bytes.
data = new Byte[256];

// String to store the response ASCII representation.
String responseData = String.Empty;

// Read the first batch of the TcpServer response bytes.
Int32 bytes = stream.Read(data, 0, data.Length);
responseData = System.Text.Encoding.ASCII.GetString(data, 0, bytes);
Console.WriteLine("Received: {0}", responseData);

// Close everything.
stream.Close();
client.Close();
```
Upvotes: 0
Views: 586
Reputation: 310980
There is no reliable 'connection state' in TCP. Your first send appears to succeed because `Write` merely hands the data to the local TCP send buffer; the failure only surfaces on a later send, after the peer has reset the connection. The only reliable way to detect a broken connection is to keep writing to it: eventually a broken connection will cause ECONNRESET, or whatever that maps to in your language (in .NET, an `IOException` wrapping a `SocketException`). You should also use a read timeout when reading, of some value like double the expected latency of the longest request, and treat it expiring as a broken connection, or at least a broken transaction; this is problematic, though, if transaction latencies vary wildly.
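A minimal sketch of that approach in C#, assuming an already constructed `TcpClient`; the helper name `TrySend` and the 5-second timeouts are illustrative, not prescribed:

```csharp
using System;
using System.IO;
using System.Net.Sockets;

class ConnectionProbe
{
    // Tries to send data and reports whether the connection still looks alive.
    // Note: a true result is not proof of a live peer -- Write can succeed by
    // merely copying the bytes into the local TCP send buffer; the reset from
    // a dead peer only surfaces on a later Write or Read.
    static bool TrySend(TcpClient client, byte[] data)
    {
        try
        {
            NetworkStream stream = client.GetStream();
            stream.WriteTimeout = 5000;  // illustrative values: roughly twice
            stream.ReadTimeout = 5000;   // the longest expected request latency
            stream.Write(data, 0, data.Length);
            return true;
        }
        catch (IOException)
        {
            // NetworkStream wraps socket errors (e.g. connection reset)
            // and read/write timeouts in IOException.
            return false;
        }
        catch (ObjectDisposedException)
        {
            // The socket was already closed on our side.
            return false;
        }
    }
}
```

Even when `TrySend` returns true, the peer may already be gone; only a subsequent write, or a timed-out read, will tell you.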
Upvotes: 1