Passing a Spark Dataset as a function argument

I want to pass a Spark Dataset as an argument to a function. For example:

  import spark.implicits._ // needed for toDS()

  case class Token1(name: String, productId: Int, score: Double)
  val data1 = Seq(
    Token1("aaa", 100, 0.12),
    Token1("aaa", 200, 0.29),
    Token1("bbb", 200, 0.53),
    Token1("bbb", 300, 0.42))
  val ds1 = data1.toDS()

  case class Token2(name: String, productId: Int)
  val data2 = Seq(
    Token2("aaa", 100),
    Token2("aaa", 200),
    Token2("bbb", 200),
    Token2("bbb", 300))
  val ds2 = data2.toDS()

  def printDS(ds: Dataset[T]): Unit = {
    ds.show()
  }
  printDS(ds1)
  printDS(ds2)

I want to pass different Datasets to printDS(). Since Spark Datasets are strongly typed, how can I pass a Dataset[Token1] or a Dataset[Token2] to printDS(), which accepts Dataset[Any]? I am able to pass Spark DataFrames as function arguments, but not Spark Datasets.

Upvotes: 1

Views: 4368

Answers (1)

Shyamendra Solanki

Reputation: 8851

Simply use a type parameter with the printDS method:

def printDS[T](ds: Dataset[T]): Unit = {
  ds.show()
}
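This works because show() is available on every Dataset[T] regardless of what T is, so no type bound (or implicit Encoder) is needed; the compiler infers T as Token1 or Token2 at each call site. The same type-parameter pattern can be sketched in plain Scala with Seq standing in for Dataset, so it runs without a Spark session (describe and the calls below are illustrative, not from the answer):

```scala
// A method with a type parameter [T] accepts any Seq[T],
// just as printDS[T] accepts any Dataset[T].
def describe[T](xs: Seq[T]): String =
  s"${xs.length} rows: ${xs.mkString(", ")}"

case class Token1(name: String, productId: Int, score: Double)
case class Token2(name: String, productId: Int)

val r1 = describe(Seq(Token1("aaa", 100, 0.12))) // T inferred as Token1
val r2 = describe(Seq(Token2("aaa", 100)))       // T inferred as Token2
```

Note that if the method body did more than show() — say, a map back to another Dataset — it would additionally need an implicit Encoder[T] in scope.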

Upvotes: 4
