Reputation: 35
I have the following RDD with thousands of entries of type (Int, Double), to which I would like to add a timestamp column so that each entry becomes (Int, Double, Datetime). I tried the following:
val addTimeStampRDD = OriginalRDD.map {
  case (a, b) =>
    (a, b, current_timestamp())
}
Unfortunately the job fails with errors such as:
java.lang.NoClassDefFoundError: scala/Product$class
Is this because current_timestamp() is a SQL function? Is there a better alternative?
Upvotes: 0
Views: 268
Reputation: 426
To achieve this in Spark you can use LocalDateTime from java.time:
scala> import java.time.LocalDateTime
import java.time.LocalDateTime
scala> LocalDateTime.now()
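Applied to the RDD from the question, a minimal sketch (assuming OriginalRDD is the RDD[(Int, Double)] described there) could look like this:

import java.time.LocalDateTime

// LocalDateTime.now() is a plain JVM call and LocalDateTime is serializable,
// so it can be used inside an RDD transformation, unlike the SQL Column
// function current_timestamp().
val addTimeStampRDD = OriginalRDD.map {
  case (a, b) => (a, b, LocalDateTime.now())
}

Note that each element is stamped with the executor's clock at the moment the task processes it, not with a single job-wide timestamp.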
You can also use DateTimeFormatter to format the date in the required format:
scala> import java.time.format.DateTimeFormatter
import java.time.format.DateTimeFormatter
scala> DateTimeFormatter.ofPattern("yyyy-MM-dd_HH:mm").format(LocalDateTime.now)
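If you want the third element to be a formatted string, the formatter can be combined with the map. DateTimeFormatter is not serializable, so one option is to build it once per partition; a sketch, again assuming the same OriginalRDD:

import java.time.LocalDateTime
import java.time.format.DateTimeFormatter

val pattern = "yyyy-MM-dd_HH:mm"

val addFormattedTimeStampRDD = OriginalRDD.mapPartitions { iter =>
  // Build the formatter once per partition on the executor side;
  // only the pattern string is captured in the closure.
  val fmt = DateTimeFormatter.ofPattern(pattern)
  iter.map { case (a, b) => (a, b, fmt.format(LocalDateTime.now())) }
}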
Upvotes: 1