Reputation: 39
I am using HDInsight Spark 2.1, and in my Jupyter notebook I would like to load multiple Spark packages. Loading a single package works:
%%configure -f
{ "conf": {"spark.jars.packages": "com.databricks:spark-avro_2.11:3.2.0"}}
But when I try either of the following:
%%configure -f
{ "conf": {"spark.jars.packages": "com.microsoft.azure.cosmosdb.spark,com.databricks:spark-avro_2.11:3.2.0"}}
OR
{ "conf": {"spark.jars.packages": ["com.databricks:spark-avro_2.11:3.2.0","com.microsoft.azure.cosmosdb.spark"]
}}
I get an error. What am I doing wrong?
Upvotes: 0
Views: 2105
Reputation: 6271
Late reply, but hopefully still helpful: spark.jars.packages expects a single comma-separated string of full Maven coordinates (groupId:artifactId:version), not a JSON array, so put both packages in one string:
%%configure -f
{ "conf": { "spark.jars.packages": "com.databricks:spark-avro_2.11:3.2.0,com.microsoft.azure:azure-cosmosdb-spark_2.3.0_2.11:1.2.2"} }
You can also add repositories the same way:
%%configure -f
{ "conf": { "spark.jars.packages": "com.databricks:spark-avro_2.11:3.2.0,com.microsoft.azure:azure-cosmosdb-spark_2.3.0_2.11:1.2.2", "spark.jars.repositories": "http://nexus.internal/repository/maven-public/"} }
Upvotes: 3
Reputation: 311
Try this:
%%configure -f
{ "conf": {"spark.jars.packages": [ "com.databricks:spark-avro_2.11:3.2.0", "com.microsoft.azure.cosmosdb.spark" ] } }
Upvotes: 0