Reputation: 683
So, I borrowed a timer approach from this excellent post:
which was very well-written and well-upvoted. However, I find that it fires approximately every 106-114 ms, not the desired 100 ms. Does this make sense, or does it seem slow? If I wanted to make this closer to an exact 100 ms (I am using it in some places to measure durations), what change should I make?
My code is below:
Handler timerHandler = new Handler();

Runnable timerRunnable = new Runnable() {
    @Override
    public void run() {
        TickTimer_Elapsed();
        // reschedule 100 ms after this run started
        timerHandler.postDelayed(this, 100);
    }
};

void TickTimer_Start() { timerHandler.postDelayed(timerRunnable, 0); } // fire the first tick right away

void TickTimer_Stop() { timerHandler.removeCallbacks(timerRunnable); }

void TickTimer_Elapsed()
{
    m_FSM.Tick_10Hz(); // actually a bit slower than 10Hz
}
Upvotes: 0
Views: 443
Reputation: 54781
Timer is an overloaded term in English, meaning either a device that measures time (e.g. a stopwatch) or a device that triggers after a time (e.g. an egg timer).
In Android, a Handler-based timer serves only the latter purpose, and it does not promise absolute accuracy: postDelayed() guarantees the runnable will not run earlier than the requested delay, but it may well run later.
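That said, if you still want the tick to stay closer to 10 Hz, a minimal sketch (reusing the timerHandler and TickTimer_Elapsed() names from the question, and using android.os.SystemClock plus Handler.postAtTime()) is to schedule each run at an absolute target time computed from a fixed start, so per-tick lateness does not accumulate:
long startUptimeMs;   // uptime when the timer was started
int tickCount;        // number of ticks fired so far

Runnable timerRunnable = new Runnable() {
    @Override
    public void run() {
        TickTimer_Elapsed();
        tickCount++;
        // Aim for an absolute target time instead of "100 ms from now",
        // so the lateness of this run is not carried into the next one.
        long nextUptimeMs = startUptimeMs + (tickCount + 1) * 100L;
        timerHandler.postAtTime(this, nextUptimeMs);
    }
};

void TickTimer_Start() {
    startUptimeMs = SystemClock.uptimeMillis();
    tickCount = 0;
    timerHandler.postAtTime(timerRunnable, startUptimeMs + 100);
}
Each individual tick can still arrive a few milliseconds late, but the error no longer adds up over time.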
"I am using it in some places to measure durations"
In real life, to tell how much time has passed, you would not watch a clock and count the seconds ticking by! You'd get nothing else done in that time. A more efficient way is to look at the clock just twice and subtract the two times. The same is true with computers,
e.g.:
long startTimeMs = System.currentTimeMillis();
Later:
long durationMs = System.currentTimeMillis() - startTimeMs;
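One caveat: System.currentTimeMillis() tracks the wall clock, which can jump if the user or the network adjusts the device time. For measuring durations, android.os.SystemClock.elapsedRealtime(), which counts monotonically since boot, may be a safer choice (a sketch along the same lines as above):
long startMs = SystemClock.elapsedRealtime();
Later:
long durationMs = SystemClock.elapsedRealtime() - startMs;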
Upvotes: 1