Reputation: 341
I've been reading for days about timezones and offsets but I can't seem to figure this out.
I have a quiz application in which the client has 5 minutes to solve a quiz. When the client sends a request to start the quiz, the server records the `start_time` of the quiz as that instant, and the `end_time` as `start_time + 5min`.
This `end_time` is sent to the browser, which then shows a countdown timer to the client by calculating the difference between the browser's `current_time` and the `end_time`.
The problem is that the browser's `current_time` is different on different devices, even if they are in the same timezone. If a device's clock is 2 minutes ahead of the standard time of its timezone, then its quiz ends 2 minutes earlier than on every other device.
I tried using UTC time universally, on the server as well as the client. However, at the same exact instant, `luxon.DateTime.utc()` returns a different time on each device. My phone's time is 2 minutes ahead of my laptop's time, and so `DateTime.utc()` returns, on my phone, a time that is 2 minutes ahead of that on my laptop. I don't want the device's time converted to UTC; I want an absolute time frame of reference that doesn't vary with device time.
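Roughly, the countdown I'm describing looks like this (a simplified sketch, not my exact code; the ISO end time and variable names are just illustrative):

```js
import { DateTime } from "luxon";

// end_time received from the server (ISO string is just an example format)
const endTime = DateTime.fromISO("2024-01-01T12:05:00Z", { zone: "utc" });

// DateTime.utc() is still derived from the device's own clock, so a clock
// that runs 2 minutes fast makes the quiz appear to end 2 minutes early.
setInterval(() => {
  const remainingMs = Math.max(endTime.diff(DateTime.utc()).toMillis(), 0);
  console.log(`remaining: ${Math.ceil(remainingMs / 1000)}s`);
}, 1000);
```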
Upvotes: 1
Views: 1557
Reputation: 341
I did not find an answer to how the approach I described above can be implemented, but I found another way to implement the timer.
Instead of the server sending the `end_time` to the browser, the server sends the `remaining_time`, which is `end_time - server_current_time`. The client receives this `remaining_time` and sets the countdown to end at `client_current_time + remaining_time`.
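A minimal sketch of this idea, server side first (assuming an Express-style server; the route names and the in-memory map are illustrative, not from my actual app):

```js
import express from "express";

const app = express();
const quizEndTimes = new Map(); // quizId -> end_time in ms, set when the quiz starts

// Starting the quiz records end_time = now + 5 minutes on the server
app.post("/quiz/:id/start", (req, res) => {
  quizEndTimes.set(req.params.id, Date.now() + 5 * 60 * 1000);
  res.sendStatus(204);
});

// Instead of sending end_time, send how much time is left right now
app.get("/quiz/:id/remaining", (req, res) => {
  const endTimeMs = quizEndTimes.get(req.params.id) ?? Date.now();
  res.json({ remainingMs: Math.max(endTimeMs - Date.now(), 0) });
});

app.listen(3000);
```

And the corresponding browser side (again just a sketch, using plain `fetch` and `Date.now()`):

```js
// Both terms of (clientEnd - Date.now()) come from the same device clock,
// so any skew of that clock against real time cancels out.
async function startCountdown(quizId) {
  const { remainingMs } = await (await fetch(`/quiz/${quizId}/remaining`)).json();
  const clientEnd = Date.now() + remainingMs;

  const timer = setInterval(() => {
    const left = Math.max(clientEnd - Date.now(), 0);
    console.log(`remaining: ${Math.ceil(left / 1000)}s`);
    if (left === 0) clearInterval(timer);
  }, 1000);
}

startCountdown("quiz-123"); // "quiz-123" is a placeholder id
```

One caveat: the `remaining_time` is computed when the response is sent, so the one-way network latency gets added to the countdown. For a 5-minute quiz that error is negligible, and the server should still enforce the real `end_time` when the answers are submitted.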
Upvotes: 2