Reputation: 147
How do I get this output using Spark SQL or Scala? I have a table with columns storing JSON values like this, and I need to split them into separate columns.
Output :
Upvotes: 0
Views: 441
Reputation: 3173
It pretty much depends on which libraries you want to use (as you mentioned, Scala or Spark).
// Spark: read the JSON string directly into a DataFrame
val rawJson = """
{"Name":"ABC.txt","UploaddedById":"xxxxx1123","UploadedByName":"James"}
"""

import spark.implicits._ // needed for .toDS
spark.read.json(Seq(rawJson).toDS)
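If the JSON lives in a table column (as in the question), a minimal sketch using Spark's built-in `from_json` with an explicit schema, assuming a DataFrame `df` with a hypothetical string column `json_col`:

```scala
import org.apache.spark.sql.functions.from_json
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// Schema matching the sample record (field names taken verbatim from the data)
val schema = StructType(Seq(
  StructField("Name", StringType),
  StructField("UploaddedById", StringType),
  StructField("UploadedByName", StringType)
))

// `df` is assumed to already exist with a string column `json_col`
val split = df
  .withColumn("parsed", from_json($"json_col", schema))
  .select($"parsed.*") // one output column per JSON key
```

This avoids a separate JSON library entirely and keeps the parsing inside the Spark execution plan.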
// play-json
import play.api.libs.json._

Json.parse(rawJson) match {
  case obj: JsObject =>
    val keys = obj.keys     // field names
    val values = obj.values // field values
    // construct a DataFrame from the keys and values
  case other => // handle other types (JsArray, etc.)
}
// circe
import io.circe._, io.circe.parser._

parse(rawJson) match {
  case Right(json) => // fetch key/value pairs and construct a DataFrame, much like above
  case Left(parseError) => ...
}
You can use almost any JSON library to parse the JSON object and then convert it to a Spark DataFrame quite easily.
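For the "construct a DataFrame" step that both library routes end on, a sketch under the assumption that the library has already given you the fields as string `(key, value)` pairs (e.g. from play-json's `obj.fields`); the helper name `pairsToDf` is hypothetical:

```scala
import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// Build a one-row DataFrame whose column names are the JSON keys
// and whose single row holds the corresponding values.
def pairsToDf(spark: SparkSession, pairs: Seq[(String, String)]) = {
  val schema = StructType(pairs.map { case (k, _) => StructField(k, StringType) })
  val row = Row(pairs.map(_._2): _*)
  spark.createDataFrame(spark.sparkContext.parallelize(Seq(row)), schema)
}
```

Treating every value as a string keeps the sketch simple; for typed columns you would map JSON types to Spark `DataType`s instead.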
Upvotes: 1