Santi Peñate-Vera

Reputation: 1186

Spark + MySQL: no spark.read

I just downloaded Spark 2.2 from the website, and created a simple project with the example from here.

The code is this:

import java.util.Properties

import org.apache.spark


object MysqlTest {

  def main(args: Array[String]) {

    val jdbcDF = spark.read
                      .format("jdbc")
                      .option("url", "jdbc:mysql://localhost/hap")
                      .option("dbtable", "hap.users")
                      .option("user", "***")
                      .option("password", "***")
                      .load()

  }

}

The problem is that apparently spark.read does not exist.

I guess the Spark API's documentation is not up to date and the examples do not work. I would appreciate a working example.

Upvotes: 0

Views: 151

Answers (2)

puhlen

Reputation: 8519

The docs should be correct, but you skipped over the part where the initialization is explained: https://spark.apache.org/docs/latest/sql-programming-guide.html#starting-point-sparksession

The convention in the Spark docs is that spark is a SparkSession instance, so it needs to be created first. You do this with SparkSession.builder():

import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder()
  .appName("Spark SQL basic example")
  .config("spark.some.config.option", "some-value")
  .getOrCreate()

// For implicit conversions like converting RDDs to DataFrames
import spark.implicits._
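
Applied to the question's code, a minimal complete sketch could look like this (the master setting is an assumption for running locally rather than through spark-submit; the connection details are the ones from the question):

import org.apache.spark.sql.SparkSession

object MysqlTest {

  def main(args: Array[String]) {

    // Create the SparkSession first; this is the "spark" the docs refer to
    val spark = SparkSession
      .builder()
      .appName("MysqlTest")
      .master("local[*]") // assumption: running locally; drop this when using spark-submit
      .getOrCreate()

    val jdbcDF = spark.read
                      .format("jdbc")
                      .option("url", "jdbc:mysql://localhost/hap")
                      .option("dbtable", "hap.users")
                      .option("user", "***")
                      .option("password", "***")
                      .load()

    jdbcDF.show()

    spark.stop()
  }
}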

Upvotes: 0

tricky

Reputation: 1553

I think you need this:

import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder()
  .appName("Yo bro")
  .getOrCreate()
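
Note that the jdbc source also needs the MySQL JDBC driver on the classpath, or the load will fail with a driver-not-found error. With sbt that is roughly the dependency below (the version shown is an assumption; use one matching your MySQL server), or pass the connector jar to spark-submit with --jars.

// build.sbt (sketch): the connector version here is an assumption
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.44"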

Upvotes: 2
