Cassie

Reputation: 3099

Check table exists Spark jdbc

I am reading some data into a data frame from Microsoft SQL server using Spark JDBC. And when the table does not exist (for example, it was dropped accidentally) I get an exception: com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'TestAllData'.

I would like to create some mechanism to check first whether the table exists and only then read the data. Is there a way to do that using Spark JDBC? Because I tried using if exists construct from Ms sql server, but it does not work for querying with Spark.

Currently, my code for reading the data looks like this:

def getDataQuery() = {
  s"(select * from TestData) as subq"
}


def jdbcOptions(dataQuery: String, partitionColumn: String, lowerBound: String, upperBound: String, numPartitions: String) = Map[String,String](
    "driver" -> config.getString("sqlserver.db.driver"),
    "url" -> config.getString("sqlserver.db.url"),
    "user" -> config.getString("sqlserver.db.user"),
    "password" -> config.getString("sqlserver.db.password"),
    "customSchema" -> config.getString("sqlserver.db.custom_schema"),
    "dbtable" -> dataQuery,
    "partitionColumn" -> partitionColumn,
    "lowerBound" -> lowerBound,
    "upperBound" -> upperBound,
    "numPartitions" -> numPartitions
  )

val dataDF = sparkSession
  .read
  .format("jdbc")
  .options(jdbcOptions(getDataQuery()))
  .load()
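One option outside of Spark is to test for the table with plain JDBC metadata before calling load. This is a sketch, not the asker's code: the tableExists name is hypothetical, and the url/user/password values are assumed to come from the same config keys shown above.

```scala
import java.sql.DriverManager

// Sketch: check table existence via the JDBC driver's metadata
// before handing the query to Spark. Connection settings are
// assumed to match the config used in jdbcOptions above.
def tableExists(url: String, user: String, password: String, table: String): Boolean = {
  val conn = DriverManager.getConnection(url, user, password)
  try {
    // getTables(catalog, schemaPattern, tableNamePattern, types)
    val rs = conn.getMetaData.getTables(null, null, table, Array("TABLE"))
    rs.next() // true iff at least one matching table was found
  } finally {
    conn.close()
  }
}
```

If this returns false you can skip the Spark read entirely instead of catching the SQLServerException.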

Upvotes: 2

Views: 3031

Answers (2)

Ehud Lev

Reputation: 2901

Same concept as Pablo López Gallego's answer, but for Postgres:

import org.apache.spark.SparkConf
import org.apache.spark.sql.{DataFrame, SparkSession}

object JdbcLoader extends App {

  val finalUrl = s"jdbc:postgresql://localhost:5432/my_db?ApplicationName=test"
  val user = "user"
  val password = "pass"

  val sparkConf = new SparkConf().setMaster("local[2]")
  val spark = SparkSession.builder().config(sparkConf).getOrCreate()


  def loadTable(tableName: String): DataFrame = {

    val opts: Map[String, String] = Map(
      "url" -> finalUrl,
      "user" -> user,
      "password" -> password,
      "dbtable" -> tableName
    )

    spark.read
      .format("jdbc")
      .options(opts)
      .load()
  }

  def checkIfTableExists(tableName: String) : Boolean = {

    var schema = "public"
    var table = tableName
    if (tableName.contains(".")){
      val schemaAndTable = tableName.split("\\.")
      schema = schemaAndTable.head
      table = schemaAndTable.last
    }

    val tableExistQ = s"(SELECT table_name FROM information_schema.tables WHERE table_schema='${schema}'" +
      s" AND table_type='BASE TABLE' and table_name = '${table}') as FOO"

    val df = loadTable(tableExistQ)
    df.count() > 0

  }


  println(checkIfTableExists("my_schema.users"))

}
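Since the question is about SQL Server, the same information_schema check can be phrased for it; SQL Server exposes information_schema.tables as well. This is a sketch: the function name is hypothetical and the "dbo" default schema is an assumption you may need to adjust.

```scala
// Sketch: build the existence query for SQL Server.
// Tables without an explicit schema are assumed to live in "dbo".
def sqlServerTableExistQuery(tableName: String): String = {
  val (schema, table) = tableName.split("\\.") match {
    case Array(s, t) => (s, t)
    case _           => ("dbo", tableName)
  }
  s"(SELECT table_name FROM information_schema.tables " +
    s"WHERE table_schema = '$schema' AND table_name = '$table') AS subq"
}
```

The resulting string can be passed as the dbtable option exactly like tableExistQ above, and the table exists when the loaded DataFrame has a nonzero count.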

Upvotes: 2

You can check with a query and an if-expression:

def tableExist() = {
  s"show tables in default"
}

val existDF = sparkSession
  .read
  .format("jdbc")
  .options(jdbcOptions(tableExist()))
  .load()

val dataDF = if (existDF.select("tableName").collect().map(_(0)).contains("TestData"))
  sparkSession
    .read
    .format("jdbc")
    .options(jdbcOptions(getDataQuery()))
    .load()
else
  sparkSession.emptyDataFrame

Upvotes: 2
