Breach

Reputation: 1328

Exception with Hive long create table statement

I have a "very long" CREATE EXTERNAL TABLE statement that I try to run in Hive (200+ columns), but I end up with this error message:

Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:For direct MetaStore DB connections, we don't support retries at the client level.)

It's supposed to create an external table over an already populated HBase table. If I reduce the number of columns in my Hive statement, it works.
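
For reference, here is a trimmed sketch of the kind of statement I'm running (table, column family and column names are placeholders; the real statement has 200+ columns, which makes the hbase.columns.mapping property very long):

  CREATE EXTERNAL TABLE my_hive_table (   -- placeholder names, trimmed to 3 of the 200+ columns
    rowkey STRING,
    col1   STRING,
    col2   STRING
  )
  STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
  WITH SERDEPROPERTIES (
    -- this mapping string grows with every column added
    "hbase.columns.mapping" = ":key,cf:col1,cf:col2"
  )
  TBLPROPERTIES ("hbase.table.name" = "my_hbase_table");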

So could it be the maximum number of columns, a connection timeout, or the length of the statement?

Please share your thoughts.

Regards, Breach

Upvotes: 0

Views: 2017

Answers (2)

Hamdi Charef

Reputation: 649

Change the type of the column "PARAM_VALUE" in the "SERDE_PARAMS" table in the metastore database.

Try this command if you are using a MySQL server to store the metastore DB (a way to check the current column type is sketched after the command):

  • ALTER TABLE SERDE_PARAMS MODIFY PARAM_VALUE TEXT NOT NULL;
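
To see what the column is currently defined as (before or after altering it), a quick check against the metastore, assuming the MySQL database is named metastore:

  -- run in the MySQL metastore database (the name "metastore" is an assumption)
  USE metastore;
  SHOW COLUMNS FROM SERDE_PARAMS LIKE 'PARAM_VALUE';
  -- older metastore schemas typically define PARAM_VALUE as varchar(4000),
  -- which a 200+ column hbase.columns.mapping value can exceed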

Hope it works for you.

Upvotes: 0

JJFord3

Reputation: 1985

Not sure whether the number of columns is the real problem given the limited information provided, but this post should help you check whether it is:

Creating a hive table with ~40K columns

Upvotes: 1
