Shruthi Br

Reputation: 1

Spark JDBC Write to Teradata: multiple Spark tasks failing with "Transaction ABORTed due to deadlock" error, resulting in stage failure

I am using Spark JDBC write to load data from Hive into a Teradata view. I am using 200 vcores and have partitioned the data into 10,000 partitions.

Spark tasks are failing with the error below, resulting in stage failure. Sometimes the application finishes successfully, but with some duplicate records.

caused by: java.sql.SQLException: [Teradata Database] [TeraJDBC 16.20.00.10] [Error 2631] [SQLState 40001] Transaction ABORTed due to deadlock.

Below is the code I have used:

spark.sql("select * from hive_table")
  .distinct()
  .repartition(10000)
  .write
  .mode("overwrite")
  .option("truncate", "true")
  .jdbc(url, dbTable, dproperties)

The Teradata view is created with "AS LOCKING ROW FOR ACCESS". The underlying table also has a unique primary index (PI).

I am unable to figure out why some Spark tasks are failing with a deadlock error. Is there a way I can stop my entire Spark application from failing because of these task failures?

Upvotes: 0

Views: 943

Answers (1)

Tom Nolan

Reputation: 450

Dozens of sessions trying to insert into the same table concurrently will likely cause a deadlock. Even though the view is defined with an access lock, a write lock must still be obtained on the backing table in order to insert rows.
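If contention between concurrent sessions is the cause, one common mitigation is to cap the number of simultaneous JDBC writers. A minimal sketch, assuming the same url, dbTable, and dproperties variables as in the question, and an illustrative partition count of 8:

// Coalescing to a small number of partitions caps the number of
// concurrent Teradata sessions, so far fewer writers compete for
// the table-level write lock on insert than with 10,000 partitions.
spark.sql("select * from hive_table")
  .distinct()
  .coalesce(8)  // 8 concurrent JDBC sessions instead of up to 200
  .write
  .mode("overwrite")
  .option("truncate", "true")
  .jdbc(url, dbTable, dproperties)

Spark's JDBC writer also accepts a numPartitions option, which achieves the same effect by coalescing the DataFrame down to that limit before writing.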

Upvotes: 0
