Reputation: 180
I have a class that uses java.util.Date to create a date object and calls getTime()
to get the current time in milliseconds.
The Java documentation says that getTime()
returns milliseconds, and that is what I see on my machine.
However, when I deploy the application on another server, the same getTime()
call returns the timestamp in seconds.
e.g.
I am wondering how this is possible; I ran the same code locally again and got the timestamp in milliseconds.
Below is the relevant part of the code:
String longTime = new Long((new Date().getTime())).toString();
if(log.isDebugEnabled())log.debug("LAST_FEED_TIME will be " + longTime + " stored.");
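For what it is worth, here is a minimal, self-contained sketch (not the original application, just an illustration) that prints the raw value from the same call and applies a rough heuristic to tell epoch seconds from epoch milliseconds:

import java.util.Date;

public class EpochCheck {
    public static void main(String[] args) {
        long raw = new Date().getTime(); // same call as in the snippet above
        System.out.println("raw value = " + raw);
        // Heuristic: the current epoch time in milliseconds is a 13-digit number,
        // while the current epoch time in seconds is a 10-digit number.
        if (raw > 100000000000L) {
            System.out.println("looks like milliseconds");
        } else {
            System.out.println("looks like seconds");
        }
    }
}

Running this on both machines should make it clear whether the raw value itself differs between them.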
Upvotes: 9
Views: 30035
Reputation: 1569
new Date() in turn uses System.currentTimeMillis(). From the Javadoc:
System.currentTimeMillis
Returns the current time in milliseconds. Note that while the unit of time of the return value is a millisecond, the granularity of the value depends on the underlying operating system and may be larger. For example, many operating systems measure time in units of tens of milliseconds.
See the description of the class Date for a discussion of slight discrepancies that may arise between "computer time" and coordinated universal time (UTC).
source: http://docs.oracle.com/javase/1.4.2/docs/api/java/lang/System.html#currentTimeMillis()
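As a quick illustration (a minimal sketch, not part of the original answer), both calls return milliseconds since the Unix epoch; any difference between the two printed values is just the short delay between the calls, not a change of unit:

import java.util.Date;

public class MillisDemo {
    public static void main(String[] args) {
        long fromDate = new Date().getTime();       // delegates to System.currentTimeMillis()
        long fromSystem = System.currentTimeMillis();
        System.out.println("new Date().getTime():       " + fromDate);
        System.out.println("System.currentTimeMillis(): " + fromSystem);
    }
}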
Upvotes: 9