mdeonte001

Reputation: 39

How to configure Jupyter %configure with multiple packages?

I am using HDInsight Spark 2.1, and in my Jupyter notebook I would like to load multiple Spark packages. Loading a single package works fine:

%%configure -f
{ "conf": {"spark.jars.packages": "com.databricks:spark-avro_2.11:3.2.0"}}

But when I try

%%configure -f
{ "conf": {"spark.jars.packages": "com.microsoft.azure.cosmosdb.spark,com.databricks:spark-avro_2.11:3.2.0"}}

or

{ "conf": {"spark.jars.packages": ["com.databricks:spark-avro_2.11:3.2.0", "com.microsoft.azure.cosmosdb.spark"] }}

I get an error. What am I doing wrong?

Upvotes: 0

Views: 2105

Answers (2)

Loki

Reputation: 6271

Late reply, but hopefully still helpful:

%%configure -f
{ "conf": { "spark.jars.packages": "com.databricks:spark-avro_2.11:3.2.0,com.microsoft.azure:azure-cosmosdb-spark_2.3.0_2.11:1.2.2" } }

You can also add repositories the same way:

%%configure -f
{ "conf": { "spark.jars.packages": "com.databricks:spark-avro_2.11:3.2.0,com.microsoft.azure:azure-cosmosdb-spark_2.3.0_2.11:1.2.2", "spark.jars.repositories": "http://nexus.internal/repository/maven-public/" } }
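The key detail here is that `spark.jars.packages` takes a single comma-separated string of Maven coordinates (`group:artifact:version`), not a JSON array. A minimal Python sketch of building that string from a list of coordinates (the coordinates are the ones used in this thread):

```python
import json

# Maven coordinates as individual strings (from the answer above)
packages = [
    "com.databricks:spark-avro_2.11:3.2.0",
    "com.microsoft.azure:azure-cosmosdb-spark_2.3.0_2.11:1.2.2",
]

# spark.jars.packages expects one comma-separated string, not a list
conf = {"conf": {"spark.jars.packages": ",".join(packages)}}

# This is the JSON body you would paste under %%configure -f
print(json.dumps(conf, indent=2))
```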

Upvotes: 3

Anthony Mattas

Reputation: 311

Try this:

%%configure -f
{ "conf": {"spark.jars.packages": [ "com.databricks:spark-avro_2.11:3.2.0", "com.microsoft.azure.cosmosdb.spark" ] } }

Upvotes: 0
