Reputation: 591
I am using Scala 2.13, Spark 3.3.0, and the latest MSSQL Spark connector, "com.microsoft.azure" % "spark-mssql-connector_2.12" % "1.3.0-BETA". Here is how I insert data into MSSQL:
df.write
.format("com.microsoft.sqlserver.jdbc.spark")
.mode(SaveMode.Append)
.option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
.option("url", jdbcUrl)
.option("dbtable", "mytable")
.option("tableLock", value = false)
.option("schemaCheckEnabled", value = false)
.save()
I am getting this exception:
Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.Map.$plus(Lscala/Tuple2;)Lscala/collection/immutable/Map;
at com.microsoft.sqlserver.jdbc.spark.DefaultSource.createRelation(DefaultSource.scala:55)
at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
The same code works if I downgrade Scala from 2.13 to 2.12. A Scala 2.13 build of the spark-mssql-connector is currently not available. Is there any workaround to get rid of this error while staying on Scala 2.13?
Upvotes: 0
Views: 427
Reputation: 3581
The short answer is that, in Scala 2, you can't use dependencies that weren't compiled for the same Scala minor version: they are not binary compatible.
Binary Compatibility
In Scala 2, different minor versions of the compiler were free to change how they encode language features in JVM bytecode, so each bump of the compiler's minor version broke binary compatibility. If a project had any Scala dependencies, they all needed to be (cross-)compiled against the same minor Scala version that was used in the project itself. Scala 3, by contrast, has a stable encoding into JVM bytecode.
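This is also why build tools bake the Scala binary version into artifact names. A minimal sbt sketch (the Scala patch version shown is hypothetical; the module coordinates are the ones from the question):

```scala
// build.sbt (sketch)
scalaVersion := "2.13.10" // hypothetical patch version

// `%%` appends the project's Scala binary version, expanding to
// spark-mssql-connector_2.13 — no such artifact is published, so
// resolution fails at build time rather than at runtime.
libraryDependencies +=
  "com.microsoft.azure" %% "spark-mssql-connector" % "1.3.0-BETA"

// The question's hard-coded `_2.12` suffix resolves fine, but the jar is
// binary-incompatible with a 2.13 project, which surfaces at runtime as
// the NoSuchMethodError in the stack trace above.
// libraryDependencies +=
//   "com.microsoft.azure" % "spark-mssql-connector_2.12" % "1.3.0-BETA"
```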
As you can see in the artifactory repository of spark-mssql-connector, there is currently no published release that is compatible with Scala 2.13.
As you can see in the GitHub repo of sql-spark-connector, there is an open issue titled "Need support for scala version 2.13", opened on May 2, 2023, with no activity yet.
The Support section in the README of the project says:
Support
The Apache Spark Connector for Azure SQL and SQL Server is an open source project. This connector does not come with any Microsoft support. For issues with or questions about the connector, please create an Issue in this project repository. The connector community is active and monitoring submissions.
If it's mandatory for the connector to be compatible with Scala 2.13, you can do the upgrade yourself. Otherwise you can wait until the upgrade is done, but you have no guarantee of when that will happen.
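One further option, not covered above but commonly used as a fallback: Spark's built-in "jdbc" data source ships with Spark itself, so it is always available for whatever Scala version your Spark build uses. You lose the connector's bulk-copy performance, but the write works on Scala 2.13. A sketch, reusing `df` and `jdbcUrl` from the question:

```scala
import org.apache.spark.sql.SaveMode

// Fallback: Spark's built-in "jdbc" source instead of the mssql connector.
// It is part of Spark core, so it matches your runtime's Scala version.
df.write
  .format("jdbc")
  .mode(SaveMode.Append)
  .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
  .option("url", jdbcUrl)
  .option("dbtable", "mytable")
  // Plain batched inserts; slower than the connector's bulk copy.
  .option("batchsize", "10000")
  .save()
```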
Upvotes: 1
Reputation: 2851
You can't use a Scala version different from the one your Spark runtime was built for: the binary versions (e.g. 2.12 vs 2.13) must match, and sometimes even the patch version matters if there are bugs.
Is there a reason why using 2.12, which works, isn't enough?
Upvotes: 0