khusnanadia

Reputation: 853

How to convert epoch to datetime in Redshift?

I work in DBeaver. I have a table x.

TABLE x has a column "timestamp"

1464800406459 
1464800400452 
1464800414056 
1464800422854 
1464800411797

The result I want:

Wed, 01 Jun 2016 17:00:06.459 GMT
Wed, 01 Jun 2016 17:00:00.452 GMT
Wed, 01 Jun 2016 17:00:14.056 GMT
Wed, 01 Jun 2016 17:00:22.854 GMT 
Wed, 01 Jun 2016 17:00:11.797 GMT 

I tried this Redshift query:

SELECT FROM_UNIXTIME(x.timestamp) as x_date_time 
FROM x

but it didn't work.

Error occurred:

Invalid operation: function from_unixtime(character varying) does not exist

I also tried

SELECT DATE_FORMAT(x.timestamp, '%d/%m/%Y') as x_date 
FROM x

Error occurred:

Invalid operation: function date_format(character varying, "unknown") does not exist

Is there anything wrong with the syntax? Or is there another way to convert this to a human-readable date and time?

Upvotes: 52

Views: 113734

Answers (6)

sasikumar reddy

Reputation: 11

SELECT TIMESTAMP 'epoch' + cast(column_name as bigint) * INTERVAL '1 second'
  AS column_name

Upvotes: 1

s7vr

Reputation: 75994

Redshift doesn't have the from_unixtime() function. You'll need the SQL below to get the timestamp: it simply adds the number of seconds to the epoch and returns the result as a timestamp.

select timestamp 'epoch' + your_timestamp_column * interval '1 second' AS your_column_alias
from your_table
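Applied to the table in the question, where the column appears to be character varying holding millisecond epochs (both details inferred from the error message and the sample values, so treat this as an untested sketch), that would look roughly like:

-- sketch: cast the varchar column to bigint; integer division by 1000 drops the millisecond part
select timestamp 'epoch' + cast(x."timestamp" as bigint) / 1000 * interval '1 second' as x_date_time
from x;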

Upvotes: 113

Biswajeet Praharaj

Reputation: 67

I used it like this:

CAST(DATEADD(S, CONVERT(int, LEFT(column_name, 10)), '1970-01-01') as timestamp) as column_name

SELECT
    task_id
    ,CAST(DATEADD(S, CONVERT(int, LEFT(SLA, 10)), '1970-01-01') as timestamp) as SLA
FROM my_schema.my_task_table;

Upvotes: 0

user2357169

Reputation: 109

For quick reference, here is a SQL UDF implementation of the from_unixtime function that another answer writes in Python. I haven't tested the performance, but I imagine it would be similar to the plain SQL version. It's a whole lot easier to write, though.

Note: this expects the epoch value in milliseconds and converts it to whole seconds from the epoch.

CREATE FUNCTION from_unixtime (BIGINT)
  RETURNS TIMESTAMP WITHOUT TIME ZONE
IMMUTABLE
as $$
  SELECT TIMESTAMP 'epoch' + $1 / 1000 * interval '1 second'
$$ LANGUAGE sql;
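A possible call against the question's table (an assumption on my part: the varchar "timestamp" column casts cleanly to bigint, and the raw millisecond values are passed in as-is because the UDF already divides by 1000):

-- sketch: the UDF expects milliseconds as BIGINT
SELECT from_unixtime(cast(x."timestamp" as bigint)) AS x_date_time
FROM x;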

Upvotes: 1

Gaurav

Reputation: 1175

A UDF is going to be pretty slow. I checked the execution time of three solutions over 1k rows.

The slowest -

-- using UDF from one of the answers
SELECT from_unixtime(column_with_time_in_ms/ 1000)
FROM table_name LIMIT 1000;

Execution time: 00:00:02.348062s

2nd best -

SELECT date_add('ms',column_with_time_in_ms,'1970-01-01')
FROM table_name LIMIT 1000;

Execution time: 00:00:01.112831s

And the fastest -

SELECT TIMESTAMP 'epoch' + column_with_time_in_ms / 1000 * INTERVAL '1 second'
FROM table_name LIMIT 1000;

Execution time: 00:00:00.095102s


Execution time calculated from stl_query -

SELECT *
       ,endtime - starttime
FROM stl_query
WHERE querytxt ilike('%table_name%limit%')
ORDER BY starttime DESC;

Upvotes: 38

aloschilov

Reputation: 576

The simplest solution is to create a from_unixtime() function:

CREATE OR REPLACE FUNCTION from_unixtime(epoch BIGINT)
  RETURNS TIMESTAMP  AS
'import datetime

return datetime.datetime.fromtimestamp(epoch)
'
LANGUAGE plpythonu IMMUTABLE;
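A hedged usage sketch for the question's data (my assumptions: the varchar column holds milliseconds, so it is divided down to seconds first; note also that datetime.fromtimestamp interprets the value in the server's local time zone):

-- sketch: this UDF takes whole seconds, so divide the millisecond values by 1000
SELECT from_unixtime(cast(x."timestamp" as bigint) / 1000) AS x_date_time
FROM x;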

See the Redshift documentation on UDFs for details.

Upvotes: 19
