Reputation: 25
Relatively simple question, I feel. I'm attempting to convert an integer column holding epoch time (Unix seconds) into a date formatted as MM/DD/YYYY.
e.g., convert 881250949 --> 12/04/1997
Any advice?
Upvotes: 0
Views: 2196
Reputation: 5487
Using the from_unixtime and date_format functions, we can achieve the required result:
SPARK_SCALA
val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._
import org.apache.spark.sql.functions._
spark.sparkContext.setLogLevel("ERROR")
// Sample dataframe
val df = Seq(881250949).toDF("col")
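// from_unixtime converts the epoch seconds to a timestamp string,
// then date_format renders it as MM/dd/yyyy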
df.withColumn("col", date_format(from_unixtime('col), "MM/dd/yyyy"))
.show(false)
+----------+
|col |
+----------+
|12/04/1997|
+----------+
PYSPARK
from pyspark.sql import *
from pyspark.sql.functions import *
spark = SparkSession.builder.master("local").getOrCreate()
# Sample dataframe
df = spark.createDataFrame([(1,881250949)], "id int, date int")
df.withColumn("date", date_format(from_unixtime("date"), "MM/dd/yyyy"))\
.show()
+---+----------+
| id|      date|
+---+----------+
|  1|12/04/1997|
+---+----------+
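One caveat worth noting: from_unixtime renders the epoch value in the Spark session time zone, so the formatted date can shift by a day depending on the zone. A minimal sketch, reusing the sample dataframe above and pinning the zone through the standard spark.sql.session.timeZone setting:
# Pin the session time zone so the rendered date is deterministic
spark.conf.set("spark.sql.session.timeZone", "UTC")
df.withColumn("date", date_format(from_unixtime("date"), "MM/dd/yyyy")).show()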
Upvotes: 1