user3548808

Reputation: 103

Linux C++: Convert seconds (double) to milliseconds, microseconds, nanoseconds, picoseconds

I have a device which gives me a delay in seconds as a char*, with a value like 0.000003650000. I have to convert that value to milliseconds, microseconds, nanoseconds, or picoseconds. I wrote an application in Qt Creator on Linux (C++). I tried to use the chrono library, but as far as I can see each duration type keeps only an integral value, so I always lose some precision. Which method is the best way to do this?

Upvotes: 0

Views: 2663

Answers (2)

Howard Hinnant

Reputation: 219508

Here's how you could do this using <chrono> plus a few simple utilities found here:

#include "date/date.h"
#include <chrono>
#include <iostream>
#include <string>

using picoseconds = std::chrono::duration<long long, std::pico>;
using ldseconds = std::chrono::duration<long double>;

picoseconds
convert(const char* str)
{
    return date::round<picoseconds>(ldseconds{std::stold(std::string{str})});
}

int
main()
{
    auto t = convert("0.000003650000");
    using namespace date;
    std::cout << t << '\n';
    using namespace std::chrono;
    std::cout << duration_cast<nanoseconds>(t) << '\n';
    std::cout << duration_cast<microseconds>(t) << '\n';
    std::cout << duration_cast<milliseconds>(t) << '\n';
}

This outputs:

3650000ps
3650ns
3µs
0ms

Detailed explanation

First you need some custom chrono::durations:

  1. picoseconds, and
  2. a second based on long double.

These are easily built as shown for the type aliases picoseconds and ldseconds. The first template parameter is the representation you want to use. The second is a std::ratio that represents a compile-time rational number that is the tick-period of your duration. If you don't supply one (as in ldseconds), the default is ratio<1, 1>, which means seconds.

With these two custom durations, we can now easily write the convert function. It should input the const char* (assumes null-terminated) and output the finest duration you anticipate (picoseconds in your example).

Use std::stold to convert a std::string holding the number into a long double. Then convert that long double into ldseconds. Finally, convert the ldseconds into picoseconds using the chrono round utility found in "date/date.h".

round is also now in C++17. So if you have C++17, you can say std::chrono::round instead, which removes the need for "date/date.h" for this function.

Once the client gets the picosecond result, he can convert it to whatever units he wants with duration_cast (or in C++17, round, floor, or ceil).

I've used "date/date.h" just to make it easier to print the values out. If you prefer, you can print them out without this utility like so:

std::cout << t.count() << "ps\n";

Upvotes: 3

Convert the char* to a std::string. Check that there are exactly 12 digits after the decimal point. Convert those to an unsigned long long number of picoseconds with std::strtoull(digits, nullptr, 10); (remember to specify base 10 explicitly: with base 0, the leading zero would make strtoull interpret the digits as an octal number).

If you need to handle delays >= 1s, convert the digits before the decimal point to a number of seconds in the same way and add them into the picoseconds with pico += seconds*(1000ull*1000*1000*1000);

Finally, nanoseconds = (picoseconds + 500)/1000;

Upvotes: 4
