Heinrich

Reputation: 2214

Stopwatch.GetTimestamp() produces different results on Linux vs Windows

I am hoping someone can explain to me why the code below produces such wildly different results on Linux vs Windows.

If I have this little snippet of code:


var elapsed = Stopwatch.GetTimestamp() / TimeSpan.TicksPerMillisecond;

Thread.Sleep(1001); // let's sleep for roughly one second

var ts = Stopwatch.GetTimestamp() / TimeSpan.TicksPerMillisecond;
var result = ts - elapsed > 10000L; // roughly 10 seconds

On a Windows environment, result is false <-- expected result

But on a Linux environment, result is true <-- what... why?

I have read that Stopwatch.GetTimestamp() is dependent on the processor, but this seems excessive.

As far as I can tell, GetTimestamp produces wildly different values on Windows vs Linux.

E.g. in my case, running the code above:

On Windows, Stopwatch.GetTimestamp() produces a value roughly in the range of 165100732.

On Linux, Stopwatch.GetTimestamp() produces a value more than 200x bigger, e.g. 349232049523.

So I can see why result is different, i.e. on Windows it records an elapsed duration of 1 second, but on Linux it records an elapsed duration of close to 100 seconds. So that part is fine.

So the question boils down to: why does Stopwatch.GetTimestamp() produce such wildly different numbers between the two environments?
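
To make the platform difference visible, here is a minimal sketch that prints the raw timestamp alongside Stopwatch.Frequency and TimeSpan.TicksPerMillisecond on the current machine. The frequency values mentioned in the comments are typical figures, not guarantees; they depend on the platform and runtime.

using System;
using System.Diagnostics;

class FrequencyCheck
{
    static void Main()
    {
        // Raw, unit-less timestamp; its scale depends on Stopwatch.Frequency.
        Console.WriteLine($"GetTimestamp: {Stopwatch.GetTimestamp()}");

        // Ticks per second of the timer backing Stopwatch.
        // Typically 10,000,000 on Windows with modern .NET and 1,000,000,000 on Linux,
        // but this is platform/runtime dependent - check it rather than assume it.
        Console.WriteLine($"Frequency: {Stopwatch.Frequency}");

        // TimeSpan ticks are always 100 ns, which only lines up with Stopwatch ticks
        // when Stopwatch.Frequency happens to be 10,000,000.
        Console.WriteLine($"TicksPerMs: {TimeSpan.TicksPerMillisecond}");
    }
}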

Upvotes: 2

Views: 890

Answers (1)

Heinrich

Reputation: 2214

After doing a bit more digging, I found that I was performing the millisecond conversion incorrectly when using Stopwatch.GetTimestamp().

After updating the code as below, it started to behave consistently on both platforms.

var elapsed = Stopwatch.GetTimestamp() / (Stopwatch.Frequency / 1000);

Thread.Sleep(1001); // let's sleep for roughly one second

var ts = Stopwatch.GetTimestamp() / (Stopwatch.Frequency / 1000);
var result = ts - elapsed > 10000L; // roughly 10 seconds

The answer is thanks to a comment by Jens on this answer.
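
As a follow-up, here is a sketch of a slightly more robust variant (not part of the original fix): converting the raw tick delta with Stopwatch.Frequency in floating point avoids the integer truncation hidden in Stopwatch.Frequency / 1000.

using System;
using System.Diagnostics;
using System.Threading;

class ElapsedSketch
{
    static void Main()
    {
        long start = Stopwatch.GetTimestamp();

        Thread.Sleep(1001); // sleep for roughly one second

        long end = Stopwatch.GetTimestamp();

        // Convert raw ticks to milliseconds using the actual timer frequency.
        double elapsedMs = (end - start) * 1000.0 / Stopwatch.Frequency;

        Console.WriteLine($"Elapsed: {elapsedMs:F1} ms");
        Console.WriteLine($"Over 10s? {elapsedMs > 10_000}");

        // On .NET 7 or later the conversion can also be written as:
        // TimeSpan elapsed = Stopwatch.GetElapsedTime(start, end);
    }
}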

Upvotes: 3
