Reputation: 133
I'm trying to import a very large table that includes timestamps formatted as "2015-08-31 07:23:25.3". The table has almost 2.5 million rows, and when I import it into a data frame the timestamps lose their fractional seconds.
table1 <- dbReadTable(db_connection, "table_name")
Above is the code I have used. It works fine with another table with the same format timestamp and has 800 rows.
I have used the following code to check whether anything is hidden... no luck :-(
unclass(as.POSIXlt(table1$timestamp))
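For what it's worth, here is a quick sketch of another way to check whether the subseconds are merely hidden by the display (assuming `table1$timestamp` is already a POSIXct column; the column name is taken from the code above):

```r
## %OS with an explicit digit count prints fractional seconds
## regardless of the "digits.secs" option:
format(table1$timestamp[1:5], "%Y-%m-%d %H:%M:%OS3")

## Or inspect the numeric remainder directly: non-zero means
## the subseconds survived the import.
as.numeric(table1$timestamp[1]) %% 1
```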
Please help!
Upvotes: 0
Views: 150
Reputation: 368429
Well consider this:
R> library(anytime)
R> anytime("2015-08-31 07:23:25.3")
[1] "2015-08-31 07:23:25.2 CDT"
R>
A standard rounding error, not entirely uncommon on numeric data such as dates. But you are very likely making two mistakes:
Witness this:
R> options("digits.secs"=0); Sys.time()
[1] "2017-02-12 19:47:49 CST"
R> options("digits.secs"=6); Sys.time()
[1] "2017-02-12 19:47:53.378328 CST"
R>
R keeps time at roughly microsecond precision, but your default display settings may be suppressing the subseconds. They are still there:
R> now <- Sys.time()
R> now - trunc(now)
Time difference of 0.722091 secs
R>
I was the original mentor for the RPostgreSQL project during its Google Summer of Code. PostgreSQL has native datetime types, and so does R. Never ever import times as characters. The parsing issue disappears if you properly typecast your query.
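A minimal sketch of what that looks like, assuming a DBI connection `con` and hypothetical column names (`ts_col`, `other_col`) standing in for your real schema:

```r
library(DBI)

## Cast in SQL so the driver hands back POSIXct values, not strings.
## The explicit ::timestamp cast documents the intent even when the
## column is already a timestamp type.
table1 <- dbGetQuery(con,
    "SELECT ts_col::timestamp AS timestamp, other_col FROM table_name")

## Raise the displayed subsecond digits so you can see the result:
options(digits.secs = 3)
head(table1$timestamp)
```

With a proper timestamp column on the database side, `dbGetQuery()` (or `dbReadTable()`) returns POSIXct directly and no string parsing ever happens.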
Upvotes: 1