Ashley O

Reputation: 1190

Databricks Spark Conf

Sorry for the stupid question here, but I'm trying to change the configuration of the Spark connector, specifically the MongoDB Spark connector (I need to point the connector at our secondary server instead of our prod server). How do I access the SparkConf to make those changes? I'm using Databricks and Python, v2.1.

I'm looking at https://docs.mongodb.com/spark-connector/master/configuration/ but I'm confused about where I even type this. Locally? In a notebook? On the command line? I've tried all of those without success. If anyone has a practical, step-by-step breakdown, that would be amazing.

Upvotes: 1

Views: 1259

Answers (1)

zero323

Reputation: 330063

If you want to provide configuration using Databricks dashboard:

  • Go to your Databricks dashboard.
  • Open Clusters panel.

  • Click on Create Cluster.

  • Open the Spark tab and specify the desired options:

  • Use the same format as in Spark configuration files, including the prefix:

    spark.mongodb.input.uri  mongodb://host:port/


Upvotes: 2
