Reputation: 5055
I have a server and a client running on two Unix machines. They may be on the same LAN or far apart, connected over a VLAN. The client only receives packets and the server only sends (UDP or TCP).
How do I measure the latency between them programmatically?
One way of doing this is to add a timestamp to each packet before sending it, but the clocks on the two machines are not guaranteed to be synchronized. Any suggestions?
Upvotes: 1
Views: 2690
Reputation: 993075
If your communications are strictly unidirectional and the clocks aren't synchronised, you can't do it.
You could introduce a new packet, sent from the client to the server, that asks "what time is it?" The server would respond with its current time, and the client would measure the round-trip time of that exchange and divide it by two to estimate the one-way latency. As a side benefit, the client finds out what time the server thinks it is.
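A minimal sketch of that request/response exchange over UDP might look like the following. The host name, port, probe message, and reply format are all made up for illustration; the idea is just to timestamp the probe locally, wait for the reply, and take half the round trip as the one-way estimate.

```python
import socket
import time

SERVER_ADDR = ("server.example.com", 9999)  # hypothetical host and port

def estimate_one_way_latency(samples=10):
    """Send time probes and return min(RTT) / 2 as the one-way latency estimate."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(2.0)
    rtts = []
    for _ in range(samples):
        t_send = time.monotonic()
        sock.sendto(b"what time is it?", SERVER_ADDR)
        data, _ = sock.recvfrom(1024)            # server replies with its clock value
        rtts.append(time.monotonic() - t_send)
        server_time = float(data.decode())       # side benefit: the server's idea of "now"
    sock.close()
    return min(rtts) / 2.0                       # half the best round trip

if __name__ == "__main__":
    print("estimated one-way latency: %.3f ms" % (estimate_one_way_latency() * 1000))
```

On the server side, the matching loop just answers each probe with its current wall-clock time:

```python
import socket
import time

srv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
srv.bind(("0.0.0.0", 9999))
while True:
    _, addr = srv.recvfrom(1024)
    srv.sendto(str(time.time()).encode(), addr)
```

Using the minimum of several samples reduces the effect of transient queueing delay; note that RTT/2 still assumes the path is roughly symmetric in both directions.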
Upvotes: 3