Reputation: 1801
std::chrono::system_clock::time_point start;
// 1 second passes
std::cout << (std::chrono::high_resolution_clock::now() - start).count();
After 1 second, the above code gives me 10000000 in Visual Studio 2012, but 100000000 on gcc 4.8.2.
Changing the last line to
std::chrono::duration_cast<std::chrono::microseconds>(std::chrono::high_resolution_clock::now() - start).count();
works as expected and gives me the same result on both compilers.
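For reference, a minimal self-contained sketch of the corrected measurement (using high_resolution_clock for both time points and a sleep standing in for the elapsed second) looks like this:

#include <chrono>
#include <iostream>
#include <thread>

int main() {
    auto start = std::chrono::high_resolution_clock::now();
    std::this_thread::sleep_for(std::chrono::seconds(1)); // the "1 second passes" from above
    auto elapsed = std::chrono::high_resolution_clock::now() - start;

    // duration_cast fixes the unit, so the printed value no longer depends on
    // the clock's implementation-defined tick period.
    std::cout << std::chrono::duration_cast<std::chrono::microseconds>(elapsed).count()
              << " microseconds\n";
}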
How is this possible?
Upvotes: 4
Views: 252
Reputation: 30577
According to http://en.cppreference.com/w/cpp/chrono/high_resolution_clock:

    Class std::chrono::high_resolution_clock represents the clock with the smallest tick period provided by the implementation.
So GCC's high_resolution_clock simply has a different tick period (resolution) than VS's. The standard allows this because different systems have different requirements for time accuracy.
As you have already discovered, converting the result with std::chrono::duration_cast to a fixed unit such as microseconds gives you the same, known resolution on both compilers.
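For illustration (a minimal sketch): the tick period is exposed as the compile-time ratio high_resolution_clock::period, and duration_cast converts the raw ticks to a known unit:

#include <chrono>
#include <iostream>

int main() {
    using clock = std::chrono::high_resolution_clock;

    // period is a std::ratio of seconds per tick; it is what differs between
    // the VS2012 and gcc implementations, hence the different raw count() values.
    std::cout << "ticks per second: "
              << clock::period::den / clock::period::num << '\n';

    auto start = clock::now();
    // ... do some work ...
    auto elapsed = clock::now() - start;

    // Converting to a fixed unit makes the result portable across implementations.
    std::cout << std::chrono::duration_cast<std::chrono::milliseconds>(elapsed).count()
              << " ms\n";
}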
Upvotes: 4