User12345

Reputation: 5480

Write DataFrame to mysql table using pySpark

I am attempting to insert records into a MySQL table. The table contains id and name as columns.

I am doing the following in a pyspark shell.

name = 'tester_1'
id = '103'  
import pandas as pd
l = [id,name]

df = pd.DataFrame([l])

df.write.format('jdbc').options(
      url='jdbc:mysql://localhost/database_name',
      driver='com.mysql.jdbc.Driver',
      dbtable='DestinationTableName',
      user='your_user_name',
      password='your_password').mode('append').save()

I am getting the attribute error below:

AttributeError: 'DataFrame' object has no attribute 'write'

What am I doing wrong? What is the correct method to insert records into a MySQL table from PySpark?

Upvotes: 16

Views: 30477

Answers (2)

vegetarianCoder

Reputation: 2978

Just to add to @mrsrinivas' answer:

Make sure that the JAR of the MySQL connector is available to your Spark session. This code helps:

spark = SparkSession\
    .builder\
    .config("spark.jars", "/Users/coder/Downloads/mysql-connector-java-8.0.22.jar")\
    .master("local[*]")\
    .appName("pivot and unpivot")\
    .getOrCreate()

Otherwise it will throw an error (typically a ClassNotFoundException for the driver class).
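
If downloading the JAR by hand is inconvenient, Spark can also fetch the connector from Maven via the spark.jars.packages config. A minimal sketch, assuming the machine can reach Maven Central and that connector version 8.0.22 is the one you want:

from pyspark.sql import SparkSession

# Pull the MySQL JDBC driver from Maven instead of pointing at a local JAR file
spark = SparkSession\
    .builder\
    .config("spark.jars.packages", "mysql:mysql-connector-java:8.0.22")\
    .master("local[*]")\
    .appName("mysql jdbc write")\
    .getOrCreate()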

Upvotes: 0

mrsrinivas

Reputation: 35444

Use a Spark DataFrame instead of a pandas DataFrame, as .write is available only on a Spark DataFrame.

So the final code could be

# Each record must be a tuple so toDF can map it to the (id, name) columns
data = [('103', 'tester_1')]

df = sc.parallelize(data).toDF(['id', 'name'])

df.write.format('jdbc').options(
      url='jdbc:mysql://localhost/database_name',
      driver='com.mysql.jdbc.Driver',
      dbtable='DestinationTableName',
      user='your_user_name',
      password='your_password').mode('append').save()
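
On Spark 2.x and later you can skip the SparkContext detour entirely. A minimal sketch, assuming a SparkSession named spark is already available (as it is in the pyspark shell) and reusing the same placeholder connection options as above:

# Build the Spark DataFrame directly from a list of (id, name) tuples
df = spark.createDataFrame([('103', 'tester_1')], ['id', 'name'])

# Write it to MySQL over JDBC; URL, credentials, and table name are placeholders
df.write.format('jdbc').options(
      url='jdbc:mysql://localhost/database_name',
      driver='com.mysql.jdbc.Driver',
      dbtable='DestinationTableName',
      user='your_user_name',
      password='your_password').mode('append').save()

If you already have a pandas DataFrame, spark.createDataFrame(pandas_df) converts it to a Spark DataFrame with the same columns before writing.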

Upvotes: 20
