TechyAdam

Reputation: 117

Measuring latency

I'm working on a multiplayer project in Java and I am trying to refine how I gather my latency measurement results.

My current setup is to send a batch of UDP packets at regular intervals; the server timestamps each one and returns it, and the latency is then calculated and recorded. I take a number of samples and work out the average to get the latency.

Does this seem like a reasonable solution to work out the latency on the client side?
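For reference, the scheme I'm describing looks roughly like this, with a loopback echo thread standing in for the real game server (all names here are illustrative, and the client records its own send time per sequence number rather than trusting any server clock):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.net.SocketAddress;
import java.nio.ByteBuffer;

public class PingBatchSketch {
    // Sends a batch of UDP "pings" and averages the round-trip times.
    static double averageRttMillis(int samples) throws Exception {
        DatagramSocket server = new DatagramSocket(0);   // loopback echo stand-in
        Thread echo = new Thread(() -> {
            byte[] buf = new byte[16];
            try {
                while (true) {
                    DatagramPacket p = new DatagramPacket(buf, buf.length);
                    server.receive(p);
                    server.send(p);              // echo the packet back unchanged
                }
            } catch (Exception closed) { /* socket closed, thread exits */ }
        });
        echo.setDaemon(true);
        echo.start();

        SocketAddress serverAddr =
            new InetSocketAddress(InetAddress.getLoopbackAddress(), server.getLocalPort());
        long totalRttNanos = 0;
        try (DatagramSocket client = new DatagramSocket()) {
            client.setSoTimeout(1000);
            long[] sentAt = new long[samples];
            for (int seq = 0; seq < samples; seq++) {
                byte[] payload = ByteBuffer.allocate(4).putInt(seq).array();
                sentAt[seq] = System.nanoTime();         // client-side send time
                client.send(new DatagramPacket(payload, payload.length, serverAddr));
                DatagramPacket reply = new DatagramPacket(new byte[4], 4);
                client.receive(reply);                   // blocks until the echo
                int echoedSeq = ByteBuffer.wrap(reply.getData()).getInt();
                totalRttNanos += System.nanoTime() - sentAt[echoedSeq];
            }
        } finally {
            server.close();
        }
        return totalRttNanos / (double) samples / 1e6;
    }

    public static void main(String[] args) throws Exception {
        System.out.printf("average RTT over 8 samples: %.3f ms%n", averageRttMillis(8));
    }
}
```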

Upvotes: 6

Views: 4546

Answers (4)

Kiran Prabhu

Reputation: 51

If you are measuring round-trip latency, factors like clock drift and the precision of the hardware clock and OS timing APIs will affect your measurement. Without spending money on hardware, the closest you can get is the RDTSC instruction. But RDTSC doesn't come without its own problems; you have to be careful how you call it.
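In Java you can't issue RDTSC yourself; the closest primitive is `System.nanoTime()`, the JVM's monotonic timer, which is the right tool for measuring intervals (a minimal sketch, with the sleep standing in for a network round trip):

```java
public class MonotonicTiming {
    public static void main(String[] args) throws InterruptedException {
        // System.nanoTime() is monotonic and intended for elapsed-time measurement;
        // System.currentTimeMillis() is wall-clock time and can jump (NTP, DST).
        long start = System.nanoTime();
        Thread.sleep(50);                   // stand-in for a network round trip
        long elapsedNanos = System.nanoTime() - start;
        System.out.println("elapsed: " + elapsedNanos / 1_000_000 + " ms");
    }
}
```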

Upvotes: 0

Heisenbug

Reputation: 39164

You could also timestamp the packets already used in your game protocol, so you have more data to feed into your statistics. (This method also avoids the overhead of an additional burst of data: you simply use the packets you are already exchanging to do your stats.)

You could also start using other metrics (for example, variance) to make a more accurate estimate of your connection quality.
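As a sketch of the variance idea: a running mean and sample variance can be kept with Welford's online algorithm, so jitter can be reported alongside average latency without storing every sample (class and method names here are illustrative):

```java
public class RttStats {
    private long n = 0;
    private double mean = 0, m2 = 0;

    // Welford's online update: numerically stable running mean and variance.
    public void addSample(double rttMillis) {
        n++;
        double delta = rttMillis - mean;
        mean += delta / n;
        m2 += delta * (rttMillis - mean);
    }

    public double mean()     { return mean; }
    public double variance() { return n > 1 ? m2 / (n - 1) : 0; }  // sample variance

    public static void main(String[] args) {
        RttStats stats = new RttStats();
        for (double rtt : new double[] {40, 42, 38, 55, 41}) stats.addSample(rtt);
        // prints: mean=43.2 ms variance=45.7
        System.out.printf("mean=%.1f ms variance=%.1f%n", stats.mean(), stats.variance());
    }
}
```

A high variance with a decent mean usually indicates jitter rather than raw distance, which matters more than average latency for smoothing and interpolation in a game.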

Upvotes: 3

Chris Dennett

Reputation: 22721

If you haven't really started your project yet, consider using a networking framework like KryoNet, which provides RMI and efficient serialisation, and which will automatically send ping requests over UDP. You can get the ping-time values easily.

Upvotes: 0

NPE

Reputation: 500167

I would have the client timestamp the outgoing packet, and have the response preserve the original timestamp. This way you can compute the roundtrip latency while side-stepping any issues caused by the server and client clocks not being exactly synchronized.
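A minimal sketch of the encode/decode side of this idea (the UDP transport is omitted, and the names are illustrative): the client stamps the packet with its own monotonic clock, the server copies the payload into the reply untouched, and the client subtracts on receipt, so only the client's clock is ever consulted and any server/client clock offset cancels out.

```java
import java.nio.ByteBuffer;

public class EchoedTimestamp {
    // Client side, before send: embed the client's own monotonic timestamp.
    static byte[] stampPing(long nowNanos) {
        return ByteBuffer.allocate(8).putLong(nowNanos).array();
    }

    // Client side, on receipt: the server echoed the payload back verbatim,
    // so the difference is the round-trip time on the client's clock alone.
    static long rttNanos(byte[] echoedPayload, long nowNanos) {
        return nowNanos - ByteBuffer.wrap(echoedPayload).getLong();
    }

    public static void main(String[] args) {
        byte[] wire = stampPing(System.nanoTime()); // goes out over UDP, comes back as-is
        long rtt = rttNanos(wire, System.nanoTime());
        System.out.println("RTT nanos: " + rtt);
    }
}
```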

Upvotes: 15
