ringggg

Reputation: 21

Issue with character data type length in R & decimal precision

I'm trying to create a function that gets the precision of numeric data (the number of digits to the right of the decimal point).

    decimalplaces <- function(x) {
        if (x %% 1 != 0) {
            pattern <- "^([0-9]+)[.]([0-9]+)$"
            dec_part <- gsub(pattern, "\\2", x)
            nchar(dec_part)
        } else {
            return(0)
        }
    }
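For example (an added illustration of the failure; the input is just an arbitrary long decimal):

    decimalplaces(1.5)                     # 1, as expected
    decimalplaces(0.12345678901234567890)  # returns 15, not 20 -- trailing digits are lost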

The issue occurs with values that have more than 16 digits -- nchar coerces "dec_part" to a string, which can only store about 16 digits.

Is there a way to overcome this limitation in R?

Are there alternatives to nchar for numeric data?

(R version 3.1.1 64 bit)

Upvotes: 2

Views: 517

Answers (1)

Marat Talipov

Reputation: 13304

The 'problem' is not in nchar but in gsub, which applies as.character to a non-character x. The documentation for as.character says:

as.character represents real and complex numbers to 15 significant digits (technically the compiler's setting of the ISO C constant DBL_DIG, which will be 15 on machines supporting IEC60559 arithmetic according to the C99 standard). This ensures that all the digits in the result will be reliable (and not the result of representation error), but does mean that conversion to character and back to numeric may change the number. If you want to convert numbers to character with the maximum possible precision, use format.
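To see this truncation directly (a small illustration added here, using the same 1/3 value as the example below):

    as.character(1/3)   # "0.333333333333333" -- only 15 significant digits survive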

So, you can use

    dec_part <- gsub(pattern, "\\2", format(x, digits = 22))

instead of

    dec_part <- gsub(pattern, "\\2", x)

in your code, but be careful: the 15-significant-digit limit exists for a good reason, and the trailing digits are likely just floating-point representation noise. For example,

    > format(1/3,digits=22)
    [1] "0.3333333333333333148296"

Upvotes: 2
