Reputation: 1841
I'm looking for a way to measure service call response time in Fiddler so that it includes all stages of the process (creating the request, serializing it to XML, sending it, getting the response, deserializing it). Like this:
var start = DateTime.Now;
// client is auto-generated C# SoapHttpClientProtocol proxy for WCF service
var response = client.GetWebMethod();
var finish = DateTime.Now;
var elapsed = (finish - start).TotalMilliseconds;
The documentation proposes using the difference between the ClientDoneRequest and ClientDoneResponse timers:
var elapsed = (oSession.Timers.ClientDoneResponse - oSession.Timers.ClientDoneRequest).TotalMilliseconds;
The results I'm getting differ by around 100%, and Fiddler's values are surprisingly two times smaller (I expected the opposite, since a proxy should add some overhead while passing requests through).
It seems more like I'm looking for ClientDoneResponse - ClientBeginRequest here, but the values of both of these timers (ClientBeginRequest and ClientDoneRequest) are exactly equal in my case. Any ideas how to get at least approximately close numbers in Fiddler? Thanks in advance.
Edit:
Tried ClientBeginRequest; it doesn't work at all.
Upvotes: 0
Views: 1491
Reputation: 57095
In your code, you're using DateTime.Now, which is limited to the Windows clock resolution (15.7 ms). For higher precision, you should use the Stopwatch class instead.
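For example, a minimal sketch of the same timing with Stopwatch (reusing the client proxy and GetWebMethod call from the question):

using System.Diagnostics;

// Stopwatch uses the high-resolution performance counter,
// so it is not limited by the ~15.7 ms resolution of DateTime.Now.
var sw = Stopwatch.StartNew();
// client is the same auto-generated SoapHttpClientProtocol proxy from the question
var response = client.GetWebMethod();
sw.Stop();
var elapsed = sw.Elapsed.TotalMilliseconds;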
I don't understand enough about what you're trying to measure with Fiddler's timers. ClientDoneResponse - ClientBeginRequest measures the time between the client sending the first TCP/IP packet to Fiddler and Fiddler sending the final TCP/IP packet back to the client.
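If it helps, here is a rough sketch that logs both deltas for every session so you can compare them. It assumes the classic FiddlerCore API (an assumption on my part, since you may be running the Fiddler UI, where the same timers are shown in the session's Statistics tab):

using System;
using Fiddler;

class TimerLogger
{
    static void Main()
    {
        // Log both deltas after each session completes.
        FiddlerApplication.AfterSessionComplete += session =>
        {
            var t = session.Timers;
            var doneToDone = (t.ClientDoneResponse - t.ClientDoneRequest).TotalMilliseconds;
            var beginToDone = (t.ClientDoneResponse - t.ClientBeginRequest).TotalMilliseconds;
            Console.WriteLine(session.fullUrl + ": done-to-done " + doneToDone + " ms, begin-to-done " + beginToDone + " ms");
        };

        // Start the proxy on port 8877 (hypothetical port choice).
        FiddlerApplication.Startup(8877, FiddlerCoreStartupFlags.Default);
        Console.WriteLine("Press Enter to stop.");
        Console.ReadLine();
        FiddlerApplication.Shutdown();
    }
}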
Upvotes: 1