Reputation: 1864
A lot of people here have asked how to convert unsigned/signed integer/long to C string.
And the most common answer is to use sprintf (or snprintf). However, it requires a different format string for each type (e.g. uint32_t, int32_t, uint64_t, int64_t, etc.).
And I have a function template like this:
// T can only be uint16_t, int16_t, uint32_t, int32_t, uint64_t, int64_t
template <typename T>
void foo(char* buffer, T value) {
    // write value to buffer
}
I can specialize the function template to solve my problem.
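For example, roughly what I mean by specializing (just an illustration; the PRIu32/PRId64 macros come from <cinttypes>):

#include <cinttypes>
#include <cstdio>

template <typename T>
void foo(char* buffer, T value);

template <>
void foo<uint32_t>(char* buffer, uint32_t value) {
    std::sprintf(buffer, "%" PRIu32, value);
}

template <>
void foo<int64_t>(char* buffer, int64_t value) {
    std::sprintf(buffer, "%" PRId64, value);
}

// ...and so on for each of the other types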
I just wonder whether there is a more elegant and efficient way (i.e. without a temporary buffer like stringstream).
Thanks!
Upvotes: 1
Views: 1663
Reputation: 16843
Just use sprintf, or itoa (non-portable):
#include <cstdio>

char* append_num(char* buf, int n)
{
    // sprintf returns the number of characters written (excluding the null),
    // so this returns a pointer to the terminating null
    return buf + sprintf(buf, "%d", n);
}
Some std::to_string implementations actually use sprintf and copy the result into a new std::string.
Here's something that could be considered well optimized. The difference from a regular itoa is that it does half as many integer divisions, which are not cheap instructions on most CPUs.
#include <climits>
#include <cstring>

// returns the number of decimal digits in `num`, i.e. floor(log10(num)) + 1
static int log10_1(unsigned int num)
{
    int ret;
    static_assert(sizeof(num) == 4, "expected 32-bit unsigned int");
    // extend this logic for 64-bit numbers
    if (num >= 10000)
    {
        if (num >= 1000000)
        {
            if (num >= 100000000)
                ret = (num >= 1000000000) ? 10 : 9;
            else
                ret = (num >= 10000000) ? 8 : 7;
        }
        else
            ret = (num >= 100000) ? 6 : 5;
    }
    else
    {
        if (num >= 100)
            ret = num >= 1000 ? 4 : 3;
        else
            ret = num >= 10 ? 2 : 1;
    }
    return ret;
}
// write string representation of number `n` into buf and return pointer to terminating null
char* to_str(char* buf, unsigned int n)
{
    // lookup table of digit pairs: dig_[2*i] and dig_[2*i + 1] are the decimal digits of i
    static const char dig_[] = "0001020304050607080910111213141516171819"
        "20212223242526272829303132333435363738394041424344454647484950515253545556575859"
        "60616263646566676869707172737475767778798081828384858687888990919293949596979899";
    int len = log10_1(n);
    char *p = buf + len;
    *p-- = 0;
    // peel off two digits at a time, writing from the back of the buffer
    while (n >= 100)
    {
        unsigned int x = (n % 100) * 2;
        n /= 100;
        *p-- = dig_[x + 1];
        *p-- = dig_[x];
    }
    // at most two digits remain
    if (n >= 10)
    {
        unsigned int x = n * 2;
        *p-- = dig_[x + 1];
        *p-- = dig_[x];
    }
    else
        *p-- = (char)('0' + n);
    return buf + len;
}
// write string representation of number `n` into buf and return pointer to terminating null
char* to_str(char* buf, int n)
{
    unsigned int l;
    if (n < 0)
    {
        *buf++ = '-';
        if (n == INT_MIN)
        {
            // -INT_MIN overflows, so handle this value explicitly
            static_assert(sizeof(n) == 4, "expected 32-bit int");
            memcpy(buf, "2147483648", 11);  // 11 bytes, so the terminating null is copied too
            return buf + 10;
        }
        l = (unsigned int)(-n);
    }
    else
        l = (unsigned int)n;
    return to_str(buf, l);
}
to_str is more than twice as fast as cdhowie's foo and about 6 times as fast as sprintf. Compare times:
foo time: 745 ms
to_str time: 327 ms
sprintf time: 1989 ms
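For anyone who wants to re-run the comparison, a minimal harness along these lines will do (the iteration count and test values below are arbitrary placeholders, not the exact benchmark I used; absolute times depend on machine and compiler flags):

#include <chrono>
#include <cstdio>

template <typename F>
long long time_ms(F f)
{
    auto start = std::chrono::steady_clock::now();
    f();
    auto stop = std::chrono::steady_clock::now();
    return std::chrono::duration_cast<std::chrono::milliseconds>(stop - start).count();
}

int main()
{
    char buf[16];
    unsigned int sink = 0;                     // accumulate something so the work isn't optimized away
    const unsigned int iterations = 10000000;  // arbitrary

    long long ms = time_ms([&] {
        for (unsigned int i = 0; i < iterations; ++i)
        {
            to_str(buf, i);                    // the unsigned overload above
            sink += (unsigned char)buf[0];
        }
    });
    std::printf("to_str time: %lld ms (checksum %u)\n", ms, sink);
}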
There is already a good Stack Overflow page on optimized to_string functions: C++ performance challenge: integer to std::string conversion. The fastest algorithm there is essentially identical to mine.
Upvotes: 1
Reputation: 169038
Perhaps the simplest implementation prior to C++17 would be:
std::strcpy(buffer, std::to_string(value).c_str());
This does require a temporary buffer (a temporary std::string), but I would be hesitant to prematurely optimize. In my opinion, this would be the most elegant way to do the conversion -- it's simple and easily understandable.
(Note that it's not possible with your function signature to ensure that the buffer pointer points to an allocation large enough to hold the stringified value.)
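Wrapped in your foo template, that one-liner is simply (a sketch; it needs <string> and <cstring>):

#include <cstring>
#include <string>

template <typename T>
void foo(char* buffer, T value)
{
    // std::to_string resolves for every integer width listed in the question
    std::strcpy(buffer, std::to_string(value).c_str());
}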
In C++17, you can just use std::to_chars() from <charconv>. Note that it does not write a null terminator, so you will have to add one yourself.
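For example (a sketch; the extra size parameter is my addition so the null terminator has somewhere to go):

#include <charconv>
#include <cstddef>
#include <system_error>

// returns a pointer to the terminating null, or nullptr if the buffer is too small
template <typename T>
char* foo(char* buffer, std::size_t size, T value)
{
    if (size == 0)
        return nullptr;
    // reserve the last byte for the null terminator we add ourselves
    auto [ptr, ec] = std::to_chars(buffer, buffer + size - 1, value);
    if (ec != std::errc{})
        return nullptr;
    *ptr = '\0';
    return ptr;
}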
Perhaps there is a middle ground where you can declare a trait to obtain the printf-style format specifier for each numeric type?
#include <cstdio>

template <typename T>
struct printf_specifier
{
    static char const * const value;
};

template<> char const * const printf_specifier<char>::value = "%hhd";
template<> char const * const printf_specifier<unsigned char>::value = "%hhu";
template<> char const * const printf_specifier<short>::value = "%hd";
template<> char const * const printf_specifier<unsigned short>::value = "%hu";
template<> char const * const printf_specifier<int>::value = "%d";
template<> char const * const printf_specifier<unsigned int>::value = "%u";
template<> char const * const printf_specifier<long>::value = "%ld";
template<> char const * const printf_specifier<unsigned long>::value = "%lu";
template<> char const * const printf_specifier<long long>::value = "%lld";
template<> char const * const printf_specifier<unsigned long long>::value = "%llu";
template<> char const * const printf_specifier<float>::value = "%f";
template<> char const * const printf_specifier<double>::value = "%f";

template <typename T>
void foo(char *buffer, T value)
{
    std::sprintf(buffer, printf_specifier<T>::value, value);
}
I would, however, suggest using snprintf since it won't overrun your buffer if you give it the number of characters it is allowed to write:
template <typename T>
int foo(char *buffer, std::size_t size, T value)
{
    return std::snprintf(buffer, size, printf_specifier<T>::value, value);
}
If even this is too much bloat, you can just do the conversion entirely yourself:
#include <algorithm>
#include <cstdlib>
#include <type_traits>

template <typename T>
void foo(char *buffer, T value)
{
    static_assert(std::is_integral<T>::value, "Type of value must be an integral type");
    if (value < 0) {
        *(buffer++) = '-';
    }
    char *start = buffer;
    // emit digits in reverse order, least significant first
    while (value != 0) {
        auto digit = value % 10;
        if (digit < 0) {
            digit = -digit;  // the remainder is negative for negative signed values
        }
        *(buffer++) = static_cast<char>('0' + digit);
        value /= 10;
    }
    if (buffer == start) {
        *(buffer++) = '0';
    } else {
        std::reverse(start, buffer);
    }
    *buffer = '\0';
}
It could be faster to use log10 to figure out how long the string will be and write it from back to front instead of writing it backwards and then reversing it, but I'll leave that option as an exercise for you if you deem it necessary.
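If you do want to try that, the shape is roughly the following (a sketch for unsigned values only; foo_back_to_front is a placeholder name, and the digit-counting loop stands in for a real log10):

#include <cstddef>

// count the digits, then fill the buffer from the back; no reversal needed
template <typename T>
void foo_back_to_front(char* buffer, T value)
{
    std::size_t digits = 1;
    for (T t = value; t >= 10; t /= 10)
        ++digits;
    buffer[digits] = '\0';
    for (std::size_t i = digits; i-- > 0; value /= 10)
        buffer[i] = static_cast<char>('0' + value % 10);
}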
Upvotes: 3