Marcin

Reputation: 49816

Spark: com.mysql.jdbc.Driver does not allow create table as select

I'm getting the following error when attempting to save to a MySQL database through spark:

Py4JJavaError: An error occurred while calling o41.saveAsTable.
: java.lang.RuntimeException: com.mysql.jdbc.Driver does not allow create table as select.
    at scala.sys.package$.error(package.scala:27)
    at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:242)
    at org.apache.spark.sql.hive.execution.CreateMetastoreDataSourceAsSelect.run(commands.scala:218)
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:54)
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:54)
    at org.apache.spark.sql.execution.ExecutedCommand.execute(commands.scala:64)
    at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:1099)
    at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1099)
    at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1121)
    at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1091)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
    at py4j.Gateway.invoke(Gateway.java:259)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:207)
    at java.lang.Thread.run(Thread.java:745)

My Python code:

res.saveAsTable(tableName='test.provider_phones', source='com.mysql.jdbc.Driver',
                driver='com.mysql.jdbc.Driver', mode='append',
                url='jdbc:mysql://host.amazonaws.com:port/test?user=user&password=pass')

This happens whether or not the table already exists.

I am using Spark 1.3.1.

Upvotes: 4

Views: 4187

Answers (2)

Marcin

Reputation: 49816

Unfortunately, this is not possible in PySpark 1.3.1. My solution is to switch over to Scala and use DataFrame.insertIntoJDBC.
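A rough Scala sketch of that workaround against the Spark 1.3 API (the column names, sample rows, and app name are placeholders, the JDBC URL is copied from the question, and the MySQL connector jar is assumed to be on the classpath of the driver and executors):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    val sc = new SparkContext(new SparkConf().setAppName("mysql-append"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    val url = "jdbc:mysql://host.amazonaws.com:port/test?user=user&password=pass"

    // Build the DataFrame to persist (placeholder columns and rows for illustration).
    val res = sc.parallelize(Seq(("555-0100", 1), ("555-0101", 2)))
                .toDF("phone", "provider_id")

    // Append into an existing MySQL table; overwrite = false keeps the current rows.
    res.insertIntoJDBC(url, "provider_phones", false)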

Upvotes: 1

Gourav

Reputation: 1265

You can use the createJDBCTable(url: String, table: String, allowExisting: Boolean) or insertIntoJDBC(url: String, table: String, overwrite: Boolean) functions of DataFrame, as sketched below.

http://www.sparkexpert.com/2015/04/17/save-apache-spark-dataframe-to-database/
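A minimal Scala sketch of both helpers, assuming a DataFrame named res and the placeholder URL from the question (these Spark 1.3 methods were later deprecated in favour of the DataFrameWriter API):

    val url = "jdbc:mysql://host.amazonaws.com:port/test?user=user&password=pass"

    // Create the table from res's schema and insert its rows;
    // allowExisting = true drops a pre-existing table of the same name first.
    res.createJDBCTable(url, "provider_phones", allowExisting = true)

    // Insert into a table that already exists and has a compatible schema;
    // overwrite = true would truncate the table before the inserts.
    res.insertIntoJDBC(url, "provider_phones", overwrite = false)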

Upvotes: 1
