Reputation: 3587
I am parsing some dates in this format: 2009-01-23 18:15:05
using the following function:
def loadTransactions(sqlContext: SQLContext, path: String): DataFrame = {
  val rowRdd = sqlContext.sparkContext.textFile(path).map { line =>
    val tokens = line.split(',')
    val dt = new DateTime(tokens(0))
    Row(new Timestamp(dt.getMillis))
  }
  val fields = Seq(
    StructField("timestamp", TimestampType, true)
  )
  val schema = StructType(fields)
  sqlContext.createDataFrame(rowRdd, schema)
}
Spark is throwing an error:
java.lang.IllegalArgumentException: Invalid format: "2009-01-23 18:15:05" is malformed at " 18:15:05" at org.joda.time.format.DateTimeParserBucket.doParseMillis
I presume this is because the milliseconds are missing.
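(For reference, the space-separated layout above happens to match the JDBC escape format, so the plain JDK can parse it without Joda at all; a minimal sketch using the standard `java.sql.Timestamp.valueOf`:)

```scala
import java.sql.Timestamp

// java.sql.Timestamp.valueOf parses the JDBC escape format
// "yyyy-[m]m-[d]d hh:mm:ss[.f...]", which is exactly this layout.
val ts = Timestamp.valueOf("2009-01-23 18:15:05")
println(ts)  // 2009-01-23 18:15:05.0
```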
Upvotes: 0
Views: 7465
Reputation: 330063
How about something like this?
import org.apache.spark.sql.functions.regexp_extract
def loadTransactions(sqlContext: SQLContext, path: String): DataFrame = {
  import sqlContext.implicits._  // required for toDF and the $ column syntax

  sqlContext.sparkContext.textFile(path).toDF("text").select(
    regexp_extract($"text", "^(.*?),", 1).cast("timestamp").alias("timestamp"))
}
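(The regex `^(.*?),` simply captures everything before the first comma, non-greedily. A quick standalone check of that capture, in plain Scala without Spark; the sample CSV line is hypothetical:)

```scala
// Same pattern regexp_extract uses above: group 1 is the text
// before the first comma.
val firstField = "^(.*?),".r
val line = "2009-01-23 18:15:05,42.00,GBP"  // hypothetical input line
val extracted = firstField.findFirstMatchIn(line).map(_.group(1))
println(extracted)  // Some(2009-01-23 18:15:05)
```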
Upvotes: 4
Reputation: 702
Instead of using Joda-Time, you can use the following method:
def loadTransactions(sqlContext: SQLContext, path: String): DataFrame = {
  val rowRdd = sqlContext.sparkContext.textFile(path).map { line =>
    val tokens = line.split(',')
    Row(getTimestamp(tokens(0)))
  }
  val fields = Seq(
    StructField("timestamp", TimestampType, true)
  )
  val schema = StructType(fields)
  sqlContext.createDataFrame(rowRdd, schema)
}
Use the following function to convert the string to a timestamp:
import java.sql.Timestamp
import java.text.SimpleDateFormat

def getTimestamp(x: String): Timestamp = {
  // Use HH (24-hour clock), not hh (12-hour), or hours like "18" will misparse
  val format = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss")
  if (x == "") null
  else new Timestamp(format.parse(x).getTime)
}
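(A quick sanity check of this approach, runnable without Spark. The helper name below is hypothetical, and it assumes the 24-hour `HH` pattern:)

```scala
import java.sql.Timestamp
import java.text.SimpleDateFormat

// Hypothetical helper mirroring getTimestamp above, with the
// 24-hour pattern HH so "18:15:05" keeps its hour intact.
def getTimestampChecked(x: String): Timestamp =
  if (x == "") null
  else new Timestamp(new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(x).getTime)

println(getTimestampChecked("2009-01-23 18:15:05"))  // 2009-01-23 18:15:05.0
println(getTimestampChecked(""))                     // null for empty input
```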
Upvotes: 3