Sharon Ben Asher

Reputation: 14383

Hadoop Hive: create external table with dynamic location

I am trying to create a Hive external table that points to an S3 output file.
The file name should reflect the current date (it is always a new file).

I tried this:

CREATE EXTERNAL TABLE s3_export (...)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION concat('s3://BlobStore/Exports/Daily_', from_unixtime(unix_timestamp(),'yyyy-MM-dd'));

But I get an error:
FAILED: Parse Error: line 3:9 mismatched input 'concat' expecting StringLiteral near 'LOCATION' in table location specification

Is there any way to dynamically specify the table location?

Upvotes: 3

Views: 6748

Answers (2)

Ducaz035

Reputation: 3132

This command doesn't work on my side. How did you make this happen?

hive -d s3file=s3://BlobStore/Exports/APKsCollection_test/`date +%F`/

Upvotes: -1

Sharon Ben Asher

Reputation: 14383

OK, I found the Hive variables feature. I pass the location on the CLI as follows:

hive -d s3file=s3://BlobStore/Exports/APKsCollection_test/`date +%F`/

and then use the variable in the Hive DDL:

CREATE EXTERNAL TABLE s3_export (...)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '${s3file}';
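
A minimal end-to-end sketch of the same approach, assuming the DDL is kept in a script file and run with hive -f (the file name create_s3_export.hql and the two example columns are placeholders, not part of the original answer):

-- create_s3_export.hql: references the variable defined on the command line
CREATE EXTERNAL TABLE s3_export (id STRING, payload STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '${s3file}';

# the shell expands `date +%F` before Hive starts, so the location carries today's date
hive -d s3file=s3://BlobStore/Exports/Daily_`date +%F`/ -f create_s3_export.hql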

Upvotes: 8
