Reputation: 15
I am new to Apache Spark and I am using Scala and MongoDB to learn it, following https://docs.mongodb.com/spark-connector/current/scala-api/. I am trying to read an RDD from my MongoDB database; my notebook script is below:
import com.mongodb.spark.config._
import com.mongodb.spark._
// Read configuration pointing the connector at the Atlas cluster
val readConfig = ReadConfig(Map("uri" -> "mongodb+srv://root:[email protected]/test_database.test_collection?retryWrites=true&w=majority"))
// Load the collection as an RDD and materialize it on the driver
val testRDD = MongoSpark.load(sc, readConfig)
print(testRDD.collect)
At the print(testRDD.collect) line, I got this error:
java.lang.NoSuchMethodError: com.mongodb.internal.connection.Cluster.selectServer(Lcom/mongodb/selector/ServerSelector;)Lcom/mongodb/internal/connection/Server;
followed by more than 10 "at ..." stack-trace lines.
Used libraries:
org.mongodb.spark:mongo-spark-connector_2.12:3.0.1
org.mongodb.scala:mongo-scala-driver_2.12:4.2.3
Is this a problem inside MongoDB's internal libraries, or how can I fix it?
Many thanks
Upvotes: 1
Views: 671
Reputation: 1
I was also facing the same problem. I solved it by using the mongo-spark-connector_2.12:3.0.1 jar, and along with that I also added the Scalaj HTTP 2.4.2 jar. It's working fine now.
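For reference, a sketch of that dependency set as build.sbt entries (the exact Scalaj HTTP coordinates are an assumption based on the jar named above):

libraryDependencies ++= Seq(
  // the connector jar mentioned above
  "org.mongodb.spark" %% "mongo-spark-connector" % "3.0.1",
  // Scalaj HTTP 2.4.2 (assumed coordinates: org.scalaj:scalaj-http_2.12)
  "org.scalaj" %% "scalaj-http" % "2.4.2"
)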
Upvotes: 0
Reputation: 87154
I suspect that there is a conflict between mongo-spark-connector and mongo-scala-driver. The former uses MongoDB Java driver 4.0.5, but the latter is based on version 4.2.3. I would recommend trying with only mongo-spark-connector.
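As a minimal sketch, assuming an sbt-based build, that means declaring only the connector and dropping the explicit mongo-scala-driver dependency, so the connector's own transitive MongoDB Java driver (4.0.x) is the one on the classpath:

// build.sbt: keep only the connector; remove
// "org.mongodb.scala" %% "mongo-scala-driver" % "4.2.3"
libraryDependencies += "org.mongodb.spark" %% "mongo-spark-connector" % "3.0.1"

In a notebook environment, the equivalent is attaching only the org.mongodb.spark:mongo-spark-connector_2.12:3.0.1 Maven coordinate.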
Upvotes: 1