Reputation: 463
I know that we can read and write data from existing MySQL tables using Spark JDBC. But can we also create a MySQL table and insert data into it using DataFrames? When I try to load a file into a DataFrame and write it to a non-existing table, I get a NullPointerException. Following is the error:
java.lang.NullPointerException
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:99)
    at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:469)
    at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:50)
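For context, this is roughly what I'm running (the file path, JDBC URL, and table name here are placeholders, not my actual values):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("csv-to-mysql").getOrCreate()

    // Load a CSV file into a DataFrame
    val df = spark.read.option("header", "true").csv("/path/to/input.csv")

    // Write to a MySQL table that does not exist yet
    df.write
      .mode("append")
      .jdbc("jdbc:mysql://localhost:3306/mydb", "new_table", new java.util.Properties())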
I thought the table would be created if it does not exist. Please let me know whether we can create a MySQL table and load a DataFrame's content into it. I'm using Spark 2.1.0. Thanks in advance.
Upvotes: 2
Views: 1081
Reputation: 3547
The NullPointerException happens when Spark tries to close the connection: the connection is null, most likely because some of the connection parameters are null.
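If every connection parameter is supplied explicitly, the write should go through, and Spark creates the target table when it does not already exist. A minimal sketch, assuming a local MySQL instance and placeholder credentials and table name:

    import java.util.Properties

    // Set every connection parameter explicitly so nothing reaches the JDBC layer as null
    val props = new Properties()
    props.setProperty("user", "myuser")
    props.setProperty("password", "mypassword")
    props.setProperty("driver", "com.mysql.jdbc.Driver")

    // Creates the table on the fly (if missing) and appends the DataFrame's rows
    df.write
      .mode("append")
      .jdbc("jdbc:mysql://localhost:3306/mydb", "new_table", props)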
Upvotes: 2