Issue
I am trying to measure the ping of a socket connection.
Like so:
Server: Sends ping to client
Client: Receives ping and encodes current time as response
Server: Decode response and calculate ping as = current time - response time
In theory, this should give a fairly accurate measurement of the time it takes for data to travel from the client to the server.
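The three steps above can be sketched with plain `java.net` sockets. Everything in this sketch is illustrative (class name, port number, single-shot exchange); it runs both ends on loopback so it is self-contained. It also demonstrates the flaw discussed below: the "latency" is computed from two different clocks, so any clock offset between machines is added directly to the result.

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.ServerSocket;
import java.net.Socket;

// Illustrative sketch of the described protocol (names and port are made up).
// Server sends a one-byte ping; client replies with its current epoch millis;
// server computes latency = serverNow - clientTime. On one machine the clocks
// agree, but across machines any clock offset skews the measurement.
public class PingSketch {

    public static long measureOnce(int port) throws Exception {
        try (ServerSocket server = new ServerSocket(port)) {
            // "Client" side runs in a background thread so the demo is self-contained.
            Thread client = new Thread(() -> {
                try (Socket s = new Socket("127.0.0.1", port);
                     DataInputStream in = new DataInputStream(s.getInputStream());
                     DataOutputStream out = new DataOutputStream(s.getOutputStream())) {
                    in.readByte();                              // receive ping
                    out.writeLong(System.currentTimeMillis()); // encode current time as response
                    out.flush();
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            });
            client.start();

            try (Socket s = server.accept();
                 DataOutputStream out = new DataOutputStream(s.getOutputStream());
                 DataInputStream in = new DataInputStream(s.getInputStream())) {
                out.writeByte(1);                // send ping to client
                out.flush();
                long clientTime = in.readLong(); // decode response
                long latency = System.currentTimeMillis() - clientTime;
                client.join();
                return latency;
            }
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("one-way latency estimate (ms): " + measureOnce(9876));
    }
}
```

On loopback both timestamps come from the same clock, so the result is a small non-negative number; across two machines the same code silently includes the clock offset, which is exactly the ~4s discrepancy described here.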
The ISSUE is that the encoded time (millis) from the client (a Linux VM) is ~4s earlier than the time cached on the server when the ping was sent.
It would appear that Instant.now() is returning inconsistent results across machines.
I've confirmed this by simply outputting Instant.now().toEpochMilli() on both machines.
Running both tests at the "same time", the time on the VM is several seconds behind. What is going on here?
Solution
Thank you @aatwork for your information; I've solved my problem.
The issue came from my lack of understanding of how time and UTC work on a global scale: any individual machine's clock will likely have an offset.
To correct for this, one should poll the time from an NTP server. Doing so allows for synchronized results across multiple machines.
Dependency: Apache Commons Net
List of Top Public Time Servers
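A minimal sketch of polling an NTP server with Apache Commons Net's NTPUDPClient. The server name pool.ntp.org, the timeout value, and the helper method names are my assumptions for illustration, not from the original answer; running it requires the commons-net dependency and network access.

```java
import java.net.InetAddress;
import org.apache.commons.net.ntp.NTPUDPClient;
import org.apache.commons.net.ntp.TimeInfo;

// Sketch: query an NTP server for this machine's clock offset, then apply
// that offset to local timestamps before comparing them across machines.
public class NtpOffset {

    // Returns the estimated offset (ms) between the local clock and NTP time.
    public static long clockOffsetMillis(String server) throws Exception {
        NTPUDPClient client = new NTPUDPClient();
        client.setDefaultTimeout(5000); // assumption: 5s is a reasonable timeout
        try {
            TimeInfo info = client.getTime(InetAddress.getByName(server));
            info.computeDetails(); // computes clock offset and round-trip delay
            Long offset = info.getOffset();
            return offset == null ? 0L : offset;
        } finally {
            client.close();
        }
    }

    // Pure helper: corrected timestamp = local clock reading plus NTP offset.
    public static long correctedMillis(long localMillis, long offsetMillis) {
        return localMillis + offsetMillis;
    }

    public static void main(String[] args) throws Exception {
        long offset = clockOffsetMillis("pool.ntp.org");
        System.out.println("clock offset (ms): " + offset);
        System.out.println("corrected now (ms): "
                + correctedMillis(System.currentTimeMillis(), offset));
    }
}
```

If both the server and the VM correct their timestamps this way before encoding them, the ~4s discrepancy disappears and the latency calculation becomes meaningful. (An alternative that avoids clock synchronization entirely is to measure round-trip time on the server alone and halve it.)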
Answered By - Eric Ballard
Answer Checked By - Marilyn (JavaFixing Volunteer)