Reputation: 1190
Sorry for the stupid question here, but I'm trying to change the configuration of a Spark connector, specifically the Mongo Spark connector (I need to switch it from hitting our prod server to our secondary). How do I access the SparkConf to make those changes? I'm using Databricks and Python, connector v2.1.
I'm looking at https://docs.mongodb.com/spark-connector/master/configuration/ but I'm confused about where I even type this. Locally? In a notebook? On the command line? I've tried those and haven't had any success. If anyone has a practical step-by-step breakdown, that would be amazing.
Upvotes: 1
Views: 1259
Reputation: 330063
If you want to provide the configuration through the Databricks dashboard:
Open the Clusters panel.
Click Create Cluster.
Open the Spark tab and specify the desired options.
Use the same format as in configuration files, including the prefix:
spark.mongodb.input.uri mongodb://host:port/
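Alternatively, since you asked about doing this from a notebook: a minimal sketch of overriding the URI per read in PySpark, assuming the Mongo Spark connector 2.x is attached to the cluster. The hostname, database, and collection names below are placeholders, not your actual values:

```python
from pyspark.sql import SparkSession

# In a Databricks notebook `spark` already exists; getOrCreate() just reuses it.
spark = SparkSession.builder.getOrCreate()

df = (spark.read
      .format("com.mongodb.spark.sql.DefaultSource")
      # Per-read override; this takes precedence over the cluster-level
      # spark.mongodb.input.uri setting shown above.
      .option("uri", "mongodb://secondary-host:27017/mydb.mycollection")
      .load())

df.printSchema()
```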
Upvotes: 2