Reputation: 79
I'm trying to Sqoop some tables from Postgres into Hive, converting them to ORC during the import.
sqoop import --hcatalog-home /usr/hdp/current/hive-webhcat --connect jdbc:postgresql:/// --username user --password pwd --table table --hcatalog-database default --hcatalog-table table --hcatalog-storage-stanza "stored as orcfile" --create-hcatalog-table -m 1
I was getting the below error:
ERROR tool.ImportTool: Import failed: org.apache.hive.hcatalog.common.HCatException : 2016 : Error operation not supported : Store into a transactional table default.agent from Pig/Mapreduce is not supported
So I added tblproperties to the storage stanza (single-quoting the stanza so the inner double quotes survive the shell):
sqoop import --hcatalog-home /usr/hdp/current/hive-webhcat --connect jdbc:postgresql:/// --username user --password pwd --table table --hcatalog-database default --hcatalog-table table --hcatalog-storage-stanza 'stored as orcfile tblproperties ("transactional"="false")' --create-hcatalog-table -m 1
Error:
INFO hcat.SqoopHCatUtilities: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Table default.duplicates failed strict managed table checks due to the following reason: Table is marked as a managed table but is not transactional.)
sqoop import --hcatalog-home /usr/hdp/current/hive-webhcat --connect jdbc:postgresql:/// --username user --password pwd --table table --hcatalog-database default --hcatalog-table table --hcatalog-storage-stanza 'stored as orcfile tblproperties ("transactional"="false")' --create-hcatalog-table --hcatalog-external-table -m 1
and
sqoop import --hcatalog-home /usr/hdp/current/hive-webhcat --connect jdbc:postgresql:/// --username user --password pwd --table table --hcatalog-database default --hcatalog-external-table table --hcatalog-storage-stanza 'stored as orcfile tblproperties ("transactional"="false")' --create-hcatalog-table -m 1
I tried both of the commands above; both fail with the same error:
ERROR tool.BaseSqoopTool: Unrecognized argument: --hcatalog-external-table
Now I'm stuck. Any help is appreciated.
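(For reference: on Hive 3 / HDP 3, managed ORC tables must be transactional, which is what triggers the errors above, and stock Sqoop 1.4.7 has no --hcatalog-external-table argument. One workaround that sidesteps both is to pre-create the target as an EXTERNAL ORC table in Hive, then import into it without --create-hcatalog-table. A sketch only: the JDBC URLs, table name agent, and column list are placeholders, not the real schema.)

```shell
# Pre-create the target as an EXTERNAL ORC table; external tables are
# exempt from the strict managed-table / transactional checks.
# The column list here is illustrative -- match it to the Postgres table.
beeline -u jdbc:hive2://localhost:10000/default -e "
CREATE EXTERNAL TABLE default.agent (
  id INT,
  name STRING
)
STORED AS ORC;"

# Import into the existing table: no --create-hcatalog-table and no
# --hcatalog-storage-stanza needed, since the table already exists.
sqoop import \
  --connect jdbc:postgresql://dbhost:5432/dbname \
  --username user --password pwd \
  --table agent \
  --hcatalog-database default \
  --hcatalog-table agent \
  -m 1
```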
Upvotes: 0
Views: 225
Reputation: 1
It works! This may help you. Arguments:

import --connect jdbc:postgresql://****:5432/cloudoffice_2 --username dbmanager --password-file ******** --table hive_ua_test_with_column_5 --hcatalog-database temp --hcatalog-table ua_test_adb --split-by id -m 4 --hcatalog-storage-stanza 'stored as orc tblproperties ("orc.compress"="SNAPPY")' --verbose --where 'created>=1661356800 and created<1661443200' --hive-partition-key date --hive-partition-value 2022-08-25 --columns id,ip_address

(Note: when run from a shell, the --hcatalog-storage-stanza and --where values must each be quoted as a single argument, as above.)
Upvotes: 0