anti

Reputation: 3125

Equivalent system clock milliseconds in C++ and C#?

I am passing data from a C++ DLL through to a C# application using DllImport.

What I would like to do is time the data transfer. So I would like to get the system time in milliseconds in the DLL function, then do the same again on the C# side, and take the difference between the two to calculate the time taken.

On the C++ side, I am sending a long that I obtain like this:

boost::posix_time::ptime current_date_microseconds = boost::posix_time::microsec_clock::local_time();
long millisecondStamp2 = current_date_microseconds.time_of_day().total_milliseconds();

I send that long through to C# as a variable named timestamp, and then run:

long milliseconds = DateTime.Now.Ticks / TimeSpan.TicksPerMillisecond;
long elapsed = milliseconds - timestamp;

When I print the values they look like this:

63705280140098 // C#
54540098       // C++
63705225600000 // elapsed

Why are the C++ value and the C# value so different? How can I get equivalent values from the system clock in this way?

Upvotes: 2

Views: 831

Answers (1)

Peter Duniho

Reputation: 70671

Please ignore the comment that claims that .NET DateTime ticks are divided into two parts. That comment is not correct. The DateTime.Ticks property returns a tick count in units of "one ten-millionth of a second", measured from "0:00:00 UTC on January 1, 0001, in the Gregorian calendar". It is a plain integer value, with every bit contributing to the total according to its significance.
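
To put some numbers on that (a quick illustrative fragment, not anything your code needs):

// One tick is 100 ns, so:
Console.WriteLine(TimeSpan.TicksPerMillisecond); // 10000
Console.WriteLine(TimeSpan.TicksPerSecond);      // 10000000

// DateTime.Now.Ticks counts ticks since midnight on January 1, 0001, so this
// division yields milliseconds since that date -- an enormous number.
long msSinceYearOne = DateTime.Now.Ticks / TimeSpan.TicksPerMillisecond;
Console.WriteLine(msSinceYearOne);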

Now, as far as the discrepancy in your result goes…

The C++ expression current_date_microseconds.time_of_day().total_milliseconds() gives you the total milliseconds for the day, i.e. the total number of milliseconds since midnight (based on the value, it appears you executed the code around 3 PM local time).

On the other hand, the .NET expression using DateTime.Now is measuring milliseconds since the start of the epoch, i.e. since Jan 1, 0001.

The two values are not comparable at all. They represent two completely different time periods.
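
You can sanity-check that by converting the two numbers you posted into more familiar units:

long csMs  = 63705280140098; // DateTime.Now.Ticks / TicksPerMillisecond
long cppMs = 54540098;       // total_milliseconds() since midnight

Console.WriteLine(TimeSpan.FromMilliseconds(csMs));  // roughly 737329 days, i.e. about 2019 years
Console.WriteLine(TimeSpan.FromMilliseconds(cppMs)); // 15:09:00.098, i.e. shortly after 3 PM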

In theory, you could fix this problem by instead using DateTime.Now.TimeOfDay.TotalMilliseconds on the .NET side. This would get you a lot closer to the value you expected.
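
For example (a minimal sketch, reusing your timestamp variable):

// Milliseconds since local midnight, which is what the Boost call measures too.
double milliseconds = DateTime.Now.TimeOfDay.TotalMilliseconds;
double elapsed = milliseconds - timestamp; // timestamp is the long received from the C++ side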

However…

It's not clear to me that there's any guarantee that the C++ POSIX API you're using will use exactly the same clock reference as the .NET API. Furthermore, even if it does, there is some overhead in the API itself, along with thread-scheduling perturbations, that may introduce error into the calculation.

It seems to me that a much better approach would be for you on the .NET side to use the System.Diagnostics.Stopwatch class to measure the entire time that the call into the C++ DLL takes, and then in the C++ DLL, use your POSIX API to measure the time that the C++ code takes to execute and pass that back to the C# side.

Then the C# side can just subtract the C++ time from its own time, to determine roughly what the total overhead of the call was. (Making sure, of course, to use exactly the same units for each value…e.g. milliseconds.)
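
A sketch of what the C# side could look like; the DllImport declaration here is hypothetical (the real entry point, calling convention, and parameters depend on your DLL), and it assumes the native code times itself and reports that duration in milliseconds through an out parameter:

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class InteropTiming
{
    // Hypothetical native entry point that performs the transfer and reports
    // how long the C++ code itself took, in milliseconds.
    [DllImport("MyNative.dll", CallingConvention = CallingConvention.Cdecl)]
    static extern void TransferData(out double nativeMs);

    static void Main()
    {
        var sw = Stopwatch.StartNew();
        TransferData(out double nativeMs);
        sw.Stop();

        // Total call time minus the time the C++ side reports for itself gives a
        // rough estimate of the interop/transfer overhead.
        double overheadMs = sw.Elapsed.TotalMilliseconds - nativeMs;
        Console.WriteLine($"total={sw.Elapsed.TotalMilliseconds:F3} ms, native={nativeMs:F3} ms, overhead={overheadMs:F3} ms");
    }
}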

Even so, it's important to keep in mind:

  • If you return the C++ time value in the same call, that in and of itself could affect the total overhead of the call.
  • Some of the apparent overhead could be thread-scheduling effects. I.e. if your thread gets pre-empted during the call, then part of your measurement will be the time during which the thread was pre-empted.
  • At least on the .NET side, and probably on the C++ side as well, there are still limitations to the precision of the timing. The Stopwatch class is definitely more precise than, and preferable to, DateTime, but if the overhead is small enough, you may not get useful results (but of course, if it's that small, then it's probably good enough to discover that it's too small to get useful results :) ).

Upvotes: 1
