amggg013

Reputation: 91

PySpark get minute only?

In snowflake, you can do something like:

SELECT
my_event_time,
DATE_TRUNC('minute',my_event_time)::TIME AS minute
FROM table

And it would return something like:

my_event_time             | minute
-------------------------------------
2020-08-17 13:23:49.227   | 13:23:00

Removing everything except the actual minute — can this be done on a PySpark DataFrame? The date_trunc('minute', ...) function in PySpark does something else: it zeroes the seconds but doesn't remove the date part.

Upvotes: 0

Views: 139

Answers (2)

Learn Hadoop

Reputation: 3060

Try this:

spark.sql("select current_timestamp, minute(current_timestamp)").show()

Note that minute() returns just the minute as an integer, not a formatted time.

Upvotes: 0

Mohana B C

Reputation: 5487

Use the date_format function and pass the required time format.

spark.sql("select date_format(current_timestamp,'HH:mm:ss') time").show()

+--------+
|    time|
+--------+
|10:48:13|
+--------+

Upvotes: 1
