Reputation: 7859
I'm calculating the difference in minutes between two date-times this way:
long diff = time2.getTime() - time1.getTime();
return diff / (60 * 1000);
For some strange reason it returns a smaller number of minutes than expected.
For example, the difference between 2014-01-22 18:45:00 and 2014-01-22 18:03:00 comes out as 41 minutes when it should be 42 minutes.
What's the reason? How can I fix it?
EDIT
The debugger shows "2014-01-22 18:45:00.0" for date1 and "Wed Jan 22 18:03:31 GMT 2014" for date2. I don't know why it shows two different formats.
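For reference, a minimal sketch that reproduces the behaviour, assuming (per the debugger output above) that the earlier value really carries 31 stray seconds:

import java.sql.Timestamp;
import java.util.Date;

public class DiffDemo {
    public static void main(String[] args) {
        // assumed values, taken from the debugger output above
        Date time1 = Timestamp.valueOf("2014-01-22 18:03:31");
        Date time2 = Timestamp.valueOf("2014-01-22 18:45:00");

        long diff = time2.getTime() - time1.getTime();  // 2489000 ms = 41 min 29 s
        System.out.println(diff / (60 * 1000));         // prints 41, not 42
    }
}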
Upvotes: 1
Views: 93
Reputation: 1427
As Oracle explains...
public long getTime()
Returns the number of milliseconds since January 1, 1970, 00:00:00 GMT represented by this Date object.
It's a rounding problem. If you round both 'times' to whole minutes (both up or both down) you won't have any problem; but if you don't, the leftover seconds and milliseconds are taken into account and your result can vary by 1 minute (your minimum grain of detail). I would use double instead of long, as @Nguyten Le has said.
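A quick sketch of both options (the 18:03:31 value is an assumption taken from the question's debugger output):

import java.sql.Timestamp;
import java.util.Date;

public class MinuteDiff {
    public static void main(String[] args) {
        Date time1 = Timestamp.valueOf("2014-01-22 18:03:31");  // assumed from the question
        Date time2 = Timestamp.valueOf("2014-01-22 18:45:00");

        long diffMillis = time2.getTime() - time1.getTime();

        // long division drops the fractional minutes
        System.out.println(diffMillis / (60 * 1000));    // 41

        // double division keeps them, so you can round however you prefer
        System.out.println(diffMillis / (60.0 * 1000));  // 41.4833...

        // or floor both instants to whole minutes first, then subtract
        long floored1 = time1.getTime() / (60 * 1000);
        long floored2 = time2.getTime() / (60 * 1000);
        System.out.println(floored2 - floored1);         // 42
    }
}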
Upvotes: 0
Reputation: 58
It's a rounding problem: long division truncates the decimal part. You should use double in this case if you want the exact number.
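For example (a sketch; the 2489000 ms gap, i.e. 41 minutes 29 seconds, is assumed from the question's debugger output):

public class Truncation {
    public static void main(String[] args) {
        long diff = 2489000L;                      // 41 min 29 s, assumed from the question
        System.out.println(diff / (60 * 1000));    // 41, the remainder is dropped by long division
        System.out.println(diff / (60.0 * 1000));  // 41.4833..., double keeps it
    }
}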
Upvotes: 1
Reputation: 32391
If you divide by 60f * 1000 instead, you will see a result of 41.something, because getTime() takes the seconds and milliseconds into account as well, and as you can see these are not zero in your date values (date2 carries 31 extra seconds).
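For instance, a small sketch of that division (the timestamp values are assumed from the question's debugger output):

import java.sql.Timestamp;

public class FloatDiff {
    public static void main(String[] args) {
        long time1 = Timestamp.valueOf("2014-01-22 18:03:31").getTime();  // assumed
        long time2 = Timestamp.valueOf("2014-01-22 18:45:00").getTime();

        long diff = time2 - time1;
        System.out.println(diff / (60f * 1000));  // ~41.48, the stray 31 seconds show up here
    }
}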
Upvotes: 1