Reputation: 1073
I tried the following:
import org.apache.spark.SparkException
from org.apache.spark.SparkException import SparkException
from org.apache.spark import SparkException
All of these fail with ModuleNotFoundError: No module named 'org.apache.spark.SparkException'.
I need to handle PySpark exceptions in Azure Synapse like this:
except (Py4JJavaError, SparkException, TypeError) as e:
    print(e)
Upvotes: 2
Views: 763
Reputation: 2913
Since Spark 3.5 you can import PySparkException from pyspark.errors (source).
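A minimal sketch of catching it (assuming an existing SparkSession named spark; the table name is a placeholder):

from pyspark.errors import PySparkException

try:
    # Fails on the JVM side, typically with an AnalysisException,
    # which subclasses PySparkException
    spark.sql("SELECT * FROM nonexistent_table").collect()
except PySparkException as e:
    print(e)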
In earlier versions you have to fall back to the broad built-in Exception class. If you are asserting on the exception in a pytest test, you can also do the following:
import pytest

with pytest.raises(Exception, match="SparkException"):
    df.collect()
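This works because match= is applied as a regular-expression search against the string form of the raised exception, and a stringified Py4JJavaError includes the Java class name, such as org.apache.spark.SparkException.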
Upvotes: 1
Reputation: 1495
org.apache.spark.SparkException
is the Scala exception thrown in the JVM process; you can't, and don't need to, handle it directly in PySpark. When it crosses the Py4J boundary it surfaces on the Python side as a Py4JJavaError.
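If you do need to react to the JVM-side SparkException specifically, a hedged sketch is to catch the Py4JJavaError that wraps it and inspect the message (df is a placeholder DataFrame):

from py4j.protocol import Py4JJavaError

try:
    df.collect()
except Py4JJavaError as e:
    # str(e) embeds the Java class name of the wrapped JVM exception,
    # e.g. 'org.apache.spark.SparkException: ...'
    if "org.apache.spark.SparkException" in str(e):
        print(e)
    else:
        raise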
Upvotes: 1