Reputation: 297
My Spark Scala code is failing with a Spark out-of-memory error. I am running the code from an ADF pipeline. In the Databricks cluster, the executor memory is set to 4g. I want to change this value at the ADF level instead of changing it at the cluster level. While creating a linked service there are Additional cluster settings where we can define the cluster's Spark configuration (please see below). Could someone let me know how to set the Spark executor memory in the linked service in ADF? Thank you.
Upvotes: 0
Views: 1360
Reputation: 3240
In the linked service's Spark configuration under Additional cluster settings, add Name = spark.executor.memory and Value = 6g.
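If it helps, here is a minimal Scala check you can run in the job (or a notebook cell) to confirm the value actually reached the cluster; the object name is just for illustration, and on Databricks `getOrCreate` simply attaches to the existing session:

```scala
import org.apache.spark.sql.SparkSession

object CheckExecutorMemory {
  def main(args: Array[String]): Unit = {
    // On Databricks a SparkSession already exists; getOrCreate attaches to it.
    val spark = SparkSession.builder().getOrCreate()

    // Read back the executor memory the cluster was started with.
    // If the linked service config was applied, this should print "6g".
    val executorMemory = spark.conf.get("spark.executor.memory", "not set")
    println(s"spark.executor.memory = $executorMemory")
  }
}
```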
Monitor core configuration settings to ensure your Spark jobs run in a predictable and performant way. These settings help determine the best Spark cluster configuration for your particular workloads.
Also refer to https://learn.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-settings
Upvotes: 1