Reputation: 93
I'm working on an Apache Spark project in Eclipse, using Scala. I would like to change my date format from yyyy-mm-dd to dd-mm-yyyy.
This is my code:
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setMaster("local").setAppName("trying")
val sc = new SparkContext(conf)

// Load the file and drop rows containing NULL values
val x = sc.textFile("/home/amel/1MB")
  .filter(!_.contains("NULL"))
  .filter(!_.contains("Null"))

val re = x.map(row => {
  val cols = row.split(",")
  val Cycle = cols(2)
  // Map each cycle to its duration
  val Duration = Cycle match {
    case "Licence"    => "3 years"
    case "Master"     => "2 years"
    case "Ingéniorat" => "5 years"
    case "Ingeniorat" => "5 years"
    case "Doctorat"   => "3 years"
    case _            => "NULL"
  }
  cols(0) + "," + cols(1) + "," + Cycle + "," + cols(3) + "," + Duration
})

re.collect.foreach(println)
re.collect.foreach(println)
This is an example of the result I got:
0000023497,2007-06-27,Master,SI,2 years
This is what I want my result to look like:
0000023497,27-06-2007,Master,SI,2 years
Upvotes: 3
Views: 1689
Reputation: 1758
Use the org.apache.spark.sql.functions.date_format function.
Example:
scala> df.show
+----------+
| date|
+----------+
|2019-06-25|
|2019-06-26|
|2019-06-27|
+----------+
scala> df.withColumn("date2", org.apache.spark.sql.functions.date_format($"date", "dd-MM-yyyy")).show
+----------+----------+
| date| date2|
+----------+----------+
|2019-06-25|25-06-2019|
|2019-06-26|26-06-2019|
|2019-06-27|27-06-2019|
+----------+----------+
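The question works on an RDD of raw lines; to use date_format, the same file could instead be read as a DataFrame. A minimal sketch, assuming a SparkSession named spark and invented column names ("id", "date", "cycle", "dept") for the headerless CSV:

import org.apache.spark.sql.functions.{col, date_format}

// Read the headerless CSV; the column names below are placeholders for illustration
val df = spark.read.csv("/home/amel/1MB").toDF("id", "date", "cycle", "dept")

// Rewrite the date column from yyyy-MM-dd to dd-MM-yyyy
df.withColumn("date", date_format(col("date"), "dd-MM-yyyy")).show()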
Upvotes: 2
Reputation: 61716
Here is a Scala 2.13 alternative, using pattern matching on a string interpolator:
"2007-06-27" match { case s"$year-$month-$day" => s"$day-$month-$year" }
// "27-06-2007"
Upvotes: 1
Reputation: 51271
This can be done with a regex.
val ymd = raw"(\d+)-(\d+)-(\d+)".r
ymd.replaceAllIn("2007-06-27", m => s"${m group 3}-${m group 2}-${m group 1}")
//res0: String = 27-06-2007
It can also be done with java.time formatting.
import java.time.LocalDate
import java.time.format.DateTimeFormatter
LocalDate.parse("2019-01-04")
.format(DateTimeFormatter.ofPattern("dd-MM-yyyy"))
//res1: String = 04-01-2019
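A small helper along these lines could be dropped into the question's map and applied to cols(1) before the fields are re-joined (a sketch; reformatDate is an invented name):

import java.time.LocalDate
import java.time.format.DateTimeFormatter

// Converts a yyyy-MM-dd string to dd-MM-yyyy; falls back to the input if it does not parse
def reformatDate(s: String): String =
  try LocalDate.parse(s).format(DateTimeFormatter.ofPattern("dd-MM-yyyy"))
  catch { case _: java.time.format.DateTimeParseException => s }

// e.g. cols(0) + "," + reformatDate(cols(1)) + "," + Cycle + "," + cols(3) + "," + Duration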
Upvotes: 3