SanjanaSanju

Reputation: 297

How to set spark executor memory in the Azure Data Factory Linked service

My Spark Scala code is failing with a Spark out-of-memory error. I am running the code from an ADF pipeline. In the Databricks cluster, the executor memory is set to 4g. I want to override this value at the ADF level instead of changing it at the cluster level. While creating a linked service, there are additional cluster settings where the cluster's Spark configuration can be defined; please see the screenshot below. Could someone please let me know how to set the Spark executor memory in an ADF linked service? Thank you.

[screenshot: linked service additional cluster settings]

Upvotes: 0

Views: 1360

Answers (1)

Abhishek Khandave

Reputation: 3240

In the Spark config section of the linked service, add Name = spark.executor.memory and Value = 6g.
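For reference, the same setting can be expressed in the linked service's JSON definition via the newClusterSparkConf property. This is a minimal sketch assuming a new job cluster; the domain, runtime version, node type, and worker count are placeholder values you would replace with your own:

```json
{
  "name": "AzureDatabricksLinkedService",
  "properties": {
    "type": "AzureDatabricks",
    "typeProperties": {
      "domain": "https://<your-workspace>.azuredatabricks.net",
      "newClusterVersion": "10.4.x-scala2.12",
      "newClusterNodeType": "Standard_DS3_v2",
      "newClusterNumOfWorker": "2",
      "newClusterSparkConf": {
        "spark.executor.memory": "6g"
      }
    }
  }
}
```

Entries added under "Cluster Spark conf" in the ADF UI are stored in this newClusterSparkConf map and applied when the job cluster is created, so they take precedence over the defaults configured on the cluster itself.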

[screenshot: Spark config Name/Value fields in the linked service]

Monitor these core configuration settings to ensure your Spark jobs run in a predictable and performant way; they help you determine the best Spark cluster configuration for your particular workloads.

Also refer to: https://learn.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-settings

Upvotes: 1
