Reputation: 156
My Azure Synapse notebooks complete successfully, but their runs show up as Failed in the Spark application view, even though the pipeline treats them as successful and carries on with the subsequent work.
I'm getting the following error:
Error details
This application failed due to the total number of errors: 1.
Error code 1
EXCEPTION_DURING_SPARK_JOB_CLEANUP
Message
[plugins.mooboo9-synapse.mooboo9cluster.479 WorkspaceType:<Synapse> CCID:<dd232b79-7565-4825-bf5c-af847424a079>]. [Cleanup] -> [Ended] JobResult=[Cancelled] LivyJobState=[idle]. Unable to kill livy job.
Source
Dependency
I'm assuming the cause is the "Unable to kill livy job." message, but what triggers it, and how do I fix it? I don't even know where to start. I've seen similar errors on Stack Overflow, but not this exact one.
When I look in the Livy Logs I also see these errors:
22/10/20 22:21:59 WARN DependencyUtils: Local jar /opt/livy/rsc-jars/netty-all-4.1.17.Final.jar does not exist, skipping.
22/10/20 22:21:59 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Upvotes: 0
Views: 1125
Reputation: 156
Apparently, the "SparkMetricsListener" is not unhooking from the application on completion. The root cause of this is unknown.
The issue causes the cluster to time out and shut down when idle, and the shutdown then reports an error because of this lingering listener. Adding the following configuration block to the notebook prevents the issue for the time being:
%%configure -f
{
    "conf": {
        "livy.rsc.synapse.spark-config-merge-rule.enabled": "false",
        "spark.extraListeners": ""
    }
}
When you put the above configuration in a notebook cell, do not add comments or anything else; otherwise the cell will fail.
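Since the %%configure body must be strict JSON (no comments, no trailing commas, nothing but the object itself), a quick sanity check is to run it through a JSON parser before pasting it into the notebook. This is just a local validation sketch, not part of the fix itself:

```python
import json

# The exact body of the %%configure cell (everything after the magic line).
# json.loads raises json.JSONDecodeError if the body is not strict JSON,
# e.g. if it contains a trailing comma or a // comment.
config_body = """
{
    "conf": {
        "livy.rsc.synapse.spark-config-merge-rule.enabled": "false",
        "spark.extraListeners": ""
    }
}
"""

parsed = json.loads(config_body)

# Confirm the listener list is cleared and the merge rule is disabled.
print(parsed["conf"]["spark.extraListeners"])
print(parsed["conf"]["livy.rsc.synapse.spark-config-merge-rule.enabled"])
```

If the parser accepts the body, the cell should be safe to run as-is; if it raises, the same content would also make the %%configure cell fail.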
Upvotes: 1