Georg Heiler

How can Spark write (create) a table in Hive as external in HDP 3.1?

The default approach:

spark-shell --conf spark.hadoop.metastore.catalog.default=hive
val df: DataFrame = ...
df.write.saveAsTable("db.table")

fails, as it tries to write an internal / managed / transactional table (see How to write a table to hive from spark without using the warehouse connector in HDP 3.1).

How can I tell Spark not to create a managed table, but rather an external one?
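
What I am after is something along the following lines: a minimal sketch, assuming stock Spark semantics where an explicit path passed via option("path", ...) makes saveAsTable register an external rather than a managed table (the path and table names here are hypothetical):

import org.apache.spark.sql.{DataFrame, SaveMode, SparkSession}

// In spark-shell the `spark` session is already provided; the builder is
// only here to keep the sketch self-contained.
val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
val df: DataFrame = spark.range(10).toDF("id")

// An explicit location normally means: external table, i.e. the data is
// not deleted when the table definition is dropped.
df.write
  .mode(SaveMode.Overwrite)
  .option("path", "hdfs:///tmp/external_table_demo")
  .saveAsTable("db.table")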


Answers (1)

Georg Heiler

For now, disabling transactional tables by default looks like the best option to me.

In Ambari, simply disabling the default creation of transactional tables solves my problem.

Set the following to false twice (once in the Hive-on-Tez config and once in the Hive LLAP / interactive config):

hive.strict.managed.tables = false

Then, where a transactional table is actually desired, opt back in manually per table via its table properties (TBLPROPERTIES ('transactional' = 'true')).

As a workaround, a manual CTAS (CREATE TABLE ... AS SELECT) could be an option as well; see the sketch below.
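
A minimal sketch of that workaround, assuming Spark SQL semantics where a user-supplied LOCATION makes the resulting table external (the database, table, and path names are hypothetical placeholders):

// From within spark-shell (`spark` is predefined). The explicit LOCATION
// should yield an external table instead of a managed transactional one.
spark.sql("""
  CREATE TABLE db.table_external
  USING ORC
  LOCATION 'hdfs:///tmp/table_external'
  AS SELECT * FROM db.some_source
""")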

