Reputation: 572
NetworkStream serverStream = clientSocket.GetStream();
byte[] inStream = new byte[10025];
serverStream.Read(inStream, 0, (int)clientSocket.ReceiveBufferSize);
string[] delim = {"!###!"};
string[] returndata = System.Text.Encoding.ASCII.GetString(inStream).Split(delim, StringSplitOptions.RemoveEmptyEntries);
returndata[0] is supposed to be the 2-character string "fs". Using switch() or if() it never matches "fs". When I check returndata[0].Length it says 10025, even though "fs" should have length 2... But when I Debug.WriteLine() it, only "fs" is displayed. Please help.
Edit: the data received is sent like this:
byte[] outStream = System.Text.Encoding.ASCII.GetBytes("fs!###!somethingsblabalkla");
serverStream.Write(outStream, 0, outStream.Length);
serverStream.Flush();
Upvotes: 0
Views: 118
Reputation: 63722
Your byte buffer is 10025 bytes long. That's fixed; it never changes. The number of bytes actually received is the return value of the Read method, which you're completely ignoring. The string you get from GetString therefore contains all 10025 characters, and the trailing zero characters only become invisible when you print the string out.
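A minimal sketch of the fix on the read side, assuming the same clientSocket and "!###!" delimiter as in your code:

NetworkStream serverStream = clientSocket.GetStream();
byte[] buffer = new byte[clientSocket.ReceiveBufferSize];
// Read returns how many bytes actually arrived in this call.
int bytesRead = serverStream.Read(buffer, 0, buffer.Length);
// Decode only the bytes that were received, not the whole buffer.
string received = System.Text.Encoding.ASCII.GetString(buffer, 0, bytesRead);
string[] delim = { "!###!" };
string[] returndata = received.Split(delim, StringSplitOptions.RemoveEmptyEntries);
// returndata[0] is now "fs" with Length == 2.

Even then, a single Read call is not guaranteed to return one complete message, which is why a real protocol needs framing (see below).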
In any case, this is not proper TCP handling; it will not work reliably either way, because a single Read is not guaranteed to return exactly one complete message. Try to avoid writing your own TCP code unless you really know what you're doing. Have a look at WCF, HTTP, or Lidgren instead; any of them will probably be a better idea than rolling your own TCP code.
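If you do stay with raw TCP, one common approach is to length-prefix each message so the reader knows exactly how many bytes to wait for. This is just a sketch; the SendMessage, ReceiveMessage, and ReadExactly helpers are illustrative names, not part of your code:

// Write side: prefix the ASCII payload with its length (4 bytes, machine endianness via BitConverter).
static void SendMessage(System.Net.Sockets.NetworkStream stream, string message)
{
    byte[] payload = System.Text.Encoding.ASCII.GetBytes(message);
    byte[] lengthPrefix = System.BitConverter.GetBytes(payload.Length);
    stream.Write(lengthPrefix, 0, lengthPrefix.Length);
    stream.Write(payload, 0, payload.Length);
}

// Read side: read the 4-byte length, then read exactly that many payload bytes.
static string ReceiveMessage(System.Net.Sockets.NetworkStream stream)
{
    int length = System.BitConverter.ToInt32(ReadExactly(stream, 4), 0);
    return System.Text.Encoding.ASCII.GetString(ReadExactly(stream, length));
}

// Loops because Read may return fewer bytes than requested.
static byte[] ReadExactly(System.Net.Sockets.NetworkStream stream, int count)
{
    byte[] buffer = new byte[count];
    int offset = 0;
    while (offset < count)
    {
        int read = stream.Read(buffer, offset, count - offset);
        if (read == 0) throw new System.IO.EndOfStreamException("Connection closed before the full message arrived.");
        offset += read;
    }
    return buffer;
}

With framing like this you no longer need the "!###!" delimiter at all, since each message arrives as its own complete payload.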
Upvotes: 2