Reputation: 23135
I have a thread which pushes data into a queue and another thread which reads data from the queue and processes it. I would like to measure how long the data sits in the queue before it gets processed.
I added a time field (captured with System.nanoTime()) to the data before the first thread pushes it. When the second thread processes the data, it calls System.nanoTime() again and takes the difference from the time stored in the data.
Would this work properly? I am asking because I am seeing negative differences in the logs.
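Roughly what I am doing (a minimal sketch; the queue type, class, and field names are illustrative):

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class QueueLatencyDemo {
        // Data wrapper carrying the enqueue timestamp.
        static final class Timed {
            final long enqueuedAtNanos;
            final String payload;
            Timed(String payload) {
                this.enqueuedAtNanos = System.nanoTime(); // stamped by the producer
                this.payload = payload;
            }
        }

        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<Timed> queue = new ArrayBlockingQueue<>(1024);

            Thread producer = new Thread(() -> {
                try {
                    for (int i = 0; i < 5; i++) {
                        queue.put(new Timed("msg-" + i));
                        Thread.sleep(10);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            Thread consumer = new Thread(() -> {
                try {
                    for (int i = 0; i < 5; i++) {
                        Timed t = queue.take();
                        // Both timestamps come from the same machine here.
                        long waitNanos = System.nanoTime() - t.enqueuedAtNanos;
                        System.out.printf("%s waited %.3f ms%n", t.payload, waitNanos / 1e6);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            producer.start();
            consumer.start();
            producer.join();
            consumer.join();
        }
    }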
UPDATE
I would like to clarify that the start time is recorded by a process on one machine and the difference is calculated on a different machine.
Upvotes: 6
Views: 991
Reputation: 533790
I have used System.nanoTime() between threads and processes. On a single machine it is both global and monotonically increasing (with the exception of some multi-socket Windows XP systems).
If you see a negative difference, most likely it is a bug in your code.
You can use nanoTime() between machines, but you have to adjust for the offset and drift between the two clocks. (And you can get very large negative results if you don't do this correction.)
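One way to make that correction is an NTP-style round-trip estimate. A sketch of the idea (the method name and numbers are illustrative, not from the answer; it assumes you can exchange one request/reply between the hosts):

    public class ClockOffset {
        // NTP-style offset estimate from one request/reply exchange.
        // t0: client send, t1: server receive, t2: server send, t3: client receive,
        // each taken from that machine's own clock.
        static long estimateOffsetMillis(long t0, long t1, long t2, long t3) {
            // Positive result means the server clock is ahead of the client clock.
            return ((t1 - t0) + (t2 - t3)) / 2;
        }

        public static void main(String[] args) {
            // Illustrative numbers: server clock ~50 ms ahead, 10 ms each way.
            long t0 = 1_000, t1 = 1_060, t2 = 1_065, t3 = 1_025;
            System.out.println("offset ~ " + estimateOffsetMillis(t0, t1, t2, t3) + " ms");
        }
    }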
"the start time is put by a process in a different machine and the difference is calculated in a different machine"
Between machines you need to either correct the nanoTime() values for the offset and drift between the two clocks, or use a wall clock that both machines keep roughly in sync. If you are only interested in multi-millisecond delays I would use currentTimeMillis().
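For the multi-millisecond case, a minimal sketch (the Message class is illustrative, and it assumes both hosts keep their wall clocks synchronized, e.g. via NTP, so the result is only accurate to within the sync error):

    import java.io.Serializable;

    public class CrossMachineLatency {
        // Message stamped with wall-clock time on the sending host.
        static final class Message implements Serializable {
            final long sentAtMillis = System.currentTimeMillis();
            final String payload;
            Message(String payload) { this.payload = payload; }
        }

        // Called on the receiving host after the message arrives.
        static long latencyMillis(Message m) {
            // Only meaningful to within the clock-sync error between the hosts.
            return System.currentTimeMillis() - m.sentAtMillis;
        }

        public static void main(String[] args) {
            Message m = new Message("demo");  // "sender" side
            long latency = latencyMillis(m);  // "receiver" side (same JVM here)
            System.out.println("latency ~ " + latency + " ms");
        }
    }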
Upvotes: 5