Reputation: 1097
I have written a TimeValidityCheckUtil
that determines whether the snapshot from a particular listener falls within a given time frame of the device's current time. It has this method, checkIfTimeValid:
import com.google.firebase.Timestamp;

public static boolean checkIfTimeValid(Timestamp dbTimestamp)
{
    Timestamp currentTimestamp = Timestamp.now();
    // Elapsed seconds between the device's current time and the stored timestamp.
    long seconds = currentTimestamp.getSeconds() - dbTimestamp.getSeconds();
    return seconds <= 10;
}
The database structure in Firestore is as follows:
"ABC"
  |
  |- "documentId"
       |
       |- "some_key": "string"
       |- "timestamp": timestamp
This is what happens: device A creates a documentId and writes the object with the timestamp.
Device B listens to this documentId and invokes checkIfTimeValid
to check whether device A's operation happened within 10 s of the current timestamp (i.e. whether it is recent). A sketch of both sides follows below.
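For context, a minimal sketch of the two sides, assuming the Firestore Android SDK; the variable names and the documentId value are illustrative:

import com.google.firebase.Timestamp;
import com.google.firebase.firestore.FirebaseFirestore;

import java.util.HashMap;
import java.util.Map;

FirebaseFirestore db = FirebaseFirestore.getInstance();

// Device A: create the document, stamping it with the local clock.
Map<String, Object> data = new HashMap<>();
data.put("some_key", "string");
data.put("timestamp", Timestamp.now());
db.collection("ABC").add(data);

// Device B: listen to that document and validate the timestamp's age.
db.collection("ABC").document(documentId)
        .addSnapshotListener((snapshot, error) -> {
            if (error != null || snapshot == null || !snapshot.exists()) {
                return;
            }
            Timestamp dbTimestamp = snapshot.getTimestamp("timestamp");
            if (dbTimestamp != null) {
                boolean recent = checkIfTimeValid(dbTimestamp);
            }
        });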
Even though this process happens almost instantly, the device shows a difference between the timestamps of ~57-62 s, which in my view should be no more than 1-5 s.
Why is this happening?
Upvotes: 1
Views: 1040
Reputation: 26171
Timestamp.now()
is calculated using the local device's clock. If the device's clock is out of sync with Firebase's servers, that difference will be reflected here; for example, a clock that runs about a minute fast would explain the ~57-62 s gap you are seeing.
I am not aware of a Firestore equivalent, but in the RTDB you can use the special location /.info/serverTimeOffset
to estimate the difference between the clocks in milliseconds:
firebase.database().ref('/.info/serverTimeOffset').once('value')
  .then((snap) => {
    var offset = snap.val();
    console.log('Server time delta in ms: ', offset);
    // var estimatedServerTimeMs = new Date().getTime() + offset;
  })
  .catch(console.error);
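Since your code is in Java, here is a rough Android translation of the same idea, assuming the firebase-database Android SDK (a sketch of the RTDB approach, not a Firestore-specific fix):

import android.util.Log;

import com.google.firebase.database.DataSnapshot;
import com.google.firebase.database.DatabaseError;
import com.google.firebase.database.FirebaseDatabase;
import com.google.firebase.database.ValueEventListener;

FirebaseDatabase.getInstance().getReference(".info/serverTimeOffset")
        .addListenerForSingleValueEvent(new ValueEventListener() {
            @Override
            public void onDataChange(DataSnapshot snapshot) {
                // The offset is the estimated skew between the local clock
                // and Firebase's servers, in milliseconds.
                Double offsetMs = snapshot.getValue(Double.class);
                long estimatedServerTimeMs = System.currentTimeMillis()
                        + (offsetMs != null ? offsetMs.longValue() : 0L);
                Log.d("ServerTimeOffset", "Server time delta in ms: " + offsetMs);
            }

            @Override
            public void onCancelled(DatabaseError error) {
                Log.e("ServerTimeOffset", "Failed to read offset", error.toException());
            }
        });

Adding that offset to the local clock gives you an estimate of the server's current time, which you can compare against the stored timestamp instead of trusting the device clock directly.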
Upvotes: 1