Learn2Code

Reputation: 147

How to read a key-value pair in Spark SQL?

How do I get this output using Spark SQL or Scala? I have a table with a column storing such values, and I need to split them into separate columns.

Input: (screenshot of a table with a column of JSON key-value strings; image not included)

Output: (screenshot of the same data split into separate columns; image not included)

Upvotes: 0

Views: 441

Answers (1)

AminMal

Reputation: 3173

It pretty much depends on which libraries you want to use (Spark itself or a Scala JSON library, as you mentioned).

  • Using Spark (a sketch of applying this to an existing table column follows right after this list):
import spark.implicits._  // needed for .toDS on a Seq

val rawJson = """
 {"Name":"ABC.txt","UploaddedById":"xxxxx1123","UploadedByName":"James"}
 """
// Spark infers the schema and creates one column per JSON key
spark.read.json(Seq(rawJson).toDS)
  • Using common JSON libraries:
// play
import play.api.libs.json._

Json.parse(rawJson) match {
  case obj: JsObject =>
    val values = obj.values
    val keys = obj.keys
    // construct a dataframe from the keys and values
  case other => // handle other types (JsArray, etc.)
}

// circe
import io.circe._, io.circe.parser._

parse(rawJson) match {
  case Right(json)      => // fetch key/value pairs and construct the df, much like above
  case Left(parseError) => // handle the parsing failure
}
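
If the JSON strings already sit in a column of a Spark table (as the question implies), you can also let Spark split them without leaving the DataFrame API. A minimal sketch, assuming a DataFrame df with a hypothetical column named raw_json and the field names from the sample record above:

import org.apache.spark.sql.functions.{col, from_json}
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// assumed schema matching the sample record; adjust field names/types to your data
val schema = StructType(Seq(
  StructField("Name", StringType),
  StructField("UploaddedById", StringType),
  StructField("UploadedByName", StringType)
))

// "raw_json" is a hypothetical column name holding the JSON string
val split = df
  .withColumn("parsed", from_json(col("raw_json"), schema))
  .select("parsed.*")   // one output column per JSON key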

You can use almost any JSON library to parse the JSON object and then convert the result to a Spark DataFrame fairly easily.
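
For example, a minimal sketch of that second route with circe, assuming a single flat JSON object, an active SparkSession named spark, and treating every value as a string:

import io.circe.parser.parse
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// parse one object into a Map[String, String]; non-string values fall back to their raw JSON text
val fields: Map[String, String] = parse(rawJson).toOption
  .flatMap(_.asObject)
  .map(_.toMap.map { case (k, v) => k -> v.asString.getOrElse(v.noSpaces) })
  .getOrElse(Map.empty)

// build a one-row DataFrame with one column per key
val schema = StructType(fields.keys.toSeq.map(StructField(_, StringType)))
val df = spark.createDataFrame(
  spark.sparkContext.parallelize(Seq(Row(fields.values.toSeq: _*))),
  schema
)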

Upvotes: 1
