Reputation: 15984
I want to measure the time it takes a user to complete a task (answering a quiz), and I want to measure it accurately, without the network lag. That is, if I measure the time between two requests on the server side, it won't be the real time the user took, because the network time is factored in.
But on the other hand, if I measure in JavaScript and post the timestamps to the server, the user will be able to see the code and cheat by sending false timestamps, no?
How can I take the timestamps in JavaScript and make sure the user doesn't fake them?
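To make it concrete, this is the kind of client-side measurement I have in mind; nothing stops the user from opening the dev tools and sending any number they like (the endpoint name is made up):

```js
// Client-side timing: precise, but trivially forgeable.
const startedAt = Date.now();

function submitAnswer(answer) {
  const elapsedMs = Date.now() - startedAt; // the user can simply send any number here
  return fetch('/quiz/answer', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ answer, elapsedMs }),
  });
}
```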
Upvotes: 0
Views: 80
Reputation: 404
I faced the same problem while designing an online examination portal for my project, and I went with a hybrid approach: run the timer on the client, but have the page report its elapsed time back to the server at fixed intervals (every 30 seconds), so the server can cross-check it against its own clock (see the sketch below).
This way you may not completely prevent the user from modifying the timer, but you can cap the maximum possible error at 30 seconds.
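Here is a minimal sketch of that idea, assuming the client reports every 30 seconds; the endpoint names, the `quizId` variable, and the in-memory `sessions` store are all made up for illustration.

```js
// Client sketch: local timer plus a report every 30 seconds.
// `quizId` is assumed to be defined elsewhere on the page.
const startedAt = Date.now();

setInterval(() => {
  fetch(`/quiz/${quizId}/heartbeat`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ clientElapsedMs: Date.now() - startedAt }),
  });
}, 30000);
```

On the server, every heartbeat is compared against the server's own clock, so a tampered timer is caught within one interval:

```js
// Server sketch (Node/Express).
const express = require('express');
const app = express();
app.use(express.json());

const sessions = new Map(); // sessionId -> { startedAt, suspect }

app.post('/quiz/:id/start', (req, res) => {
  sessions.set(req.params.id, { startedAt: Date.now(), suspect: false });
  res.sendStatus(204);
});

app.post('/quiz/:id/heartbeat', (req, res) => {
  const s = sessions.get(req.params.id);
  if (!s) return res.sendStatus(400);
  const serverElapsed = Date.now() - s.startedAt;
  if (Math.abs(req.body.clientElapsedMs - serverElapsed) > 30000) {
    s.suspect = true; // client timer disagrees too much: stop trusting it
  }
  res.sendStatus(204);
});

app.post('/quiz/:id/submit', (req, res) => {
  const s = sessions.get(req.params.id);
  if (!s) return res.sendStatus(400);
  const serverElapsed = Date.now() - s.startedAt; // includes network lag
  const claimed = req.body.clientElapsedMs;
  // Use the precise client figure only if no heartbeat flagged tampering
  // and the final value is also plausible; otherwise fall back.
  const trusted = !s.suspect && Math.abs(claimed - serverElapsed) <= 30000;
  res.json({ elapsedMs: trusted ? claimed : serverElapsed });
});

app.listen(3000);
```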
Upvotes: 1
Reputation: 2609
The trick here would be to measure the time using JavaScript, but also keep track of it using server-side code. That way, you can rely on the timestamps received from the client as long as you enforce a maximum difference between the two calculated times; I'd say a few seconds should be good enough. However, by doing so, you are creating an additional vector for failure.
Edit: A user could still tweak the reported time in their favor by up to the maximum enforced difference, if they manage to take advantage of the (lack of) network lag.
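A sketch of that comparison, assuming the server stores its own start and end timestamps per session (the function and constant names are illustrative):

```js
// On submit: prefer the client's (lag-free) measurement, but only if it
// agrees with the server's own measurement within a fixed tolerance.
const TOLERANCE_MS = 3000; // "a few seconds"

function resolveElapsed(serverStartMs, serverEndMs, clientElapsedMs) {
  const serverElapsedMs = serverEndMs - serverStartMs; // includes network lag
  if (Math.abs(serverElapsedMs - clientElapsedMs) <= TOLERANCE_MS) {
    return clientElapsedMs; // plausible: trust the precise client figure
  }
  return serverElapsedMs;   // suspicious: fall back to the server's figure
}
```

Per the edit above, a cheater who controls their network timing can still shave off up to `TOLERANCE_MS`, so keep the tolerance as small as your users' latency allows.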
Upvotes: 3
Reputation: 15425
Generally, for client-side code, any question that starts off with "How to securely..." is answered with "Not possible". Nothing works, not even putting variables in a closure, because I, the evil cheating user, can just change the code on my end and send whatever I like back to you.
This is the kind of validation that should be performed server-side, even with the disadvantage of network latency.
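In other words, just measure everything on the server and accept the imprecision; a minimal sketch, assuming an Express app with made-up route names:

```js
// All timing lives on the server; nothing the client reports is trusted.
const express = require('express');
const app = express();

const started = new Map(); // sessionId -> server clock when the question was sent

app.get('/quiz/:id', (req, res) => {
  started.set(req.params.id, Date.now());
  res.json({ question: 'What is 6 x 7?' });
});

app.post('/quiz/:id/answer', express.json(), (req, res) => {
  const elapsedMs = Date.now() - started.get(req.params.id);
  // Overshoots by the network round trip, but cannot be forged by the client.
  res.json({ elapsedMs });
});

app.listen(3000);
```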
Upvotes: 4