Reputation: 65
I want to measure the time a Java program needs to do something.
In my book only the java.util.Date
class was mentioned, so I assumed it was correct (or correct enough) to use it for solving this problem.
I have the following code:
time.start();
for (int i = 0; i < array.length; i++)
    array[i] = (int) (Math.random() * 100000);
time.stop();
Where time.start() and time.stop() are defined in my timer class as:
private long startTime, endTime;
Date timeMeasured = new Date();

public void start() {
    startTime = timeMeasured.getTime();
}

public void stop() {
    endTime = timeMeasured.getTime();
}
My code just returned 0 every time. After some research I found out that I should use System.currentTimeMillis() to measure time, and after trying it, that worked.
Can someone explain why, and correct my code, even though it is the wrong way to do it?
Upvotes: 1
Views: 100
Reputation: 2299
When you create a Date object with the default constructor, it represents the time at which it was created.
So any later call to getTime()
will return the creation time of that object, NOT the current time.
Your code would work if, instead of creating one object as a field, you instantiated a new Date every time before use, but then the time needed to create the object may skew your measurement.
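As an illustration, here is a minimal sketch of how the timer class could be corrected using System.currentTimeMillis(), as you already found. The class name Timer and the elapsedMillis() helper are my own naming, not something from your code; the startTime and endTime fields are assumed to be the same long fields your class already has.

// Minimal sketch of a corrected timer: the clock is read at each call
// instead of reusing a single Date object created once.
public class Timer {
    private long startTime;
    private long endTime;

    public void start() {
        // Reads the current time at the moment of the call.
        // new Date().getTime() would also work, but creating the
        // object each time adds a little overhead.
        startTime = System.currentTimeMillis();
    }

    public void stop() {
        endTime = System.currentTimeMillis();
    }

    public long elapsedMillis() {
        return endTime - startTime;
    }
}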
Upvotes: 1