Reputation: 258
I have written a very small piece of code to measure the time taken by my multiplication algorithm:
#include <time.h>

clock_t begin, end;
float time_spent;

begin = clock();
a = b*c;               /* the multiplication being measured */
end = clock();
time_spent = (float)(end - begin)/CLOCKS_PER_SEC;
I am working with MinGW under Windows.
I am guessing that end = clock() will give me the clock ticks at that particular moment. Subtracting begin from it will give me the clock ticks consumed by the multiplication. When I divide by CLOCKS_PER_SEC, I will get the elapsed time in seconds.
My first question is: Is there a difference between clock ticks and clock cycle?
My algorithm here is so small that the difference end-begin is 0. Does this mean that my code execution time was less than 1 tick and that's why I am getting zero?
Upvotes: 7
Views: 19168
Reputation: 457
Answering the difference between a clock tick and a clock cycle from a systems perspective:
Every processor is accompanied by a physical clock (usually a quartz crystal clock), which oscillates at a certain frequency (vibrations/sec). The processor keeps track of time with the help of interrupts generated from the physical clock, which interrupt the processor every time period T. This interrupt is called a 'clock tick'. The CPU counts the number of interrupts it has seen since the system started, and returns that count when you call clock(). By taking the difference between two clock tick values (obtained from clock()), you get how many interrupts were seen between those two points in time.
Most modern operating systems program the value of T to be 1 microsecond, i.e. the physical clock interrupts every 1 microsecond; this is the lowest clock granularity widely supported by most physical clocks. With 1 microsecond as T, the tick rate works out to 1,000,000 ticks per second. So, with this information, you can calculate the time elapsed from the difference of two clock tick values, i.e. difference between two ticks * tick period.

NOTE: the tick rate defined by the OS has to be <= the vibrations/sec of the physical clock, otherwise there will be a loss of precision.
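As a minimal sketch of that calculation (the dummy work loop and variable names are just placeholders so the tick difference comes out nonzero):

#include <stdio.h>
#include <time.h>

int main(void)
{
    clock_t start = clock();

    volatile double x = 0.0;                 /* dummy work so some ticks elapse */
    for (long i = 0; i < 10000000L; i++)
        x += i * 0.5;

    clock_t stop = clock();

    /* difference between the two tick counts * tick period (1.0 / CLOCKS_PER_SEC) */
    double elapsed = (double)(stop - start) / CLOCKS_PER_SEC;
    printf("ticks elapsed:   %ld\n", (long)(stop - start));
    printf("seconds elapsed: %f\n", elapsed);
    return 0;
}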
Upvotes: 7
Reputation: 24
A clock cycle is a clock tick.
A clock cycle is the speed of a computer processor, or CPU, and is determined by the amount of time between two pulses of an oscillator. Generally speaking, the higher the number of pulses per second, the faster the computer processor will be able to process information.
Upvotes: -1
Reputation: 33273
My first question is: Is there a difference between clock ticks and clock cycle?
Yes. A clock tick could be 1 millisecond or 1 microsecond while the clock cycle could be 0.3 nanoseconds. On POSIX systems CLOCKS_PER_SEC must be defined as 1000000 (1 million). Note that if the CPU measurement cannot be obtained with microsecond resolution then the smallest jump in the return value from clock() will be larger than one.
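If you want to see the granularity on your own system, here is a quick probe (just a sketch; the jump size it reports will vary by platform) that spins until the value returned by clock() changes and prints the size of that first jump:

#include <stdio.h>
#include <time.h>

int main(void)
{
    clock_t first = clock();
    clock_t next;

    /* Busy-wait until clock() advances; the size of that first jump is
       the smallest increment the implementation can report. */
    do {
        next = clock();
    } while (next == first);

    printf("CLOCKS_PER_SEC         = %ld\n", (long)CLOCKS_PER_SEC);
    printf("smallest observed jump = %ld\n", (long)(next - first));
    return 0;
}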
My algorithm here is so small that the difference end-begin is 0. Does this mean that my code execution time was less than 1 tick and that's why I am getting zero?
Yes. To get a better reading I suggest that you loop enough iterations so that you measure over several seconds.
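For example, something along these lines (the operand values and iteration count are arbitrary; adjust ITERATIONS until the run lasts a few seconds):

#include <stdio.h>
#include <time.h>

#define ITERATIONS 100000000L

int main(void)
{
    /* volatile keeps the compiler from optimizing the loop away;
       substitute your own multiplication here */
    volatile double a = 0.0, b = 1.000000001, c = 0.999999999;

    clock_t begin = clock();
    for (long i = 0; i < ITERATIONS; i++)
        a = b * c;
    clock_t end = clock();

    double total = (double)(end - begin) / CLOCKS_PER_SEC;
    printf("total: %f s, per multiplication: %e s\n",
           total, total / ITERATIONS);
    return 0;
}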
Upvotes: 5
Reputation: 12270
For your first question: clock ticks refer to the main system clock. They are the smallest unit of time recognized by the device. A clock cycle is the time taken for a full processor pulse to complete. You can work this out from your CPU speed given in Hz: a 2 GHz processor performs 2,000,000,000 clock cycles per second.
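As a rough back-of-the-envelope illustration (the 2 GHz cycle rate and 1 ms tick period below are just assumed example values):

#include <stdio.h>

int main(void)
{
    double cpu_hz  = 2.0e9;    /* assumed: 2 GHz processor           */
    double tick_hz = 1.0e3;    /* assumed: a 1 ms system clock tick  */

    printf("cycle period:    %g s\n", 1.0 / cpu_hz);    /* 0.5 ns    */
    printf("tick period:     %g s\n", 1.0 / tick_hz);   /* 1 ms      */
    printf("cycles per tick: %g\n", cpu_hz / tick_hz);  /* 2,000,000 */
    return 0;
}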
For your second question: probably yes.
Upvotes: 4