David

Reputation: 103

How to get the info in table header (schema)?

env: spark2.4.5

source: id-name.json

{"1": "a", "2": "b", "3": "c", ..., "n": "z"}

I load the .json file into a Spark Dataset using the JSON format, and it is stored like:

+---+---+---+---+---+
| 1 | 2 | 3 |...| n |
+---+---+---+---+---+
| a | b | c |...| z |
+---+---+---+---+---+

And I want to generate a result like this:

+------------+------+
|     id     | name |
+------------+------+
| 1          | a    |
| 2          | b    |
| 3          | c    |
| .          | .    |
| .          | .    |
| .          | .    |
| n          | z    |
+------------+------+

My solution using spark-sql:

select stack(n, '1', `1`, '2', `2`... ,'n', `n`) as ('id', 'name') from table_name;

This doesn't meet my needs because I don't want to hard-code all the ids in the SQL. Maybe combining `show columns from table_name` with `stack()` could help? I would be very grateful for any suggestions.

Upvotes: 0

Views: 258

Answers (1)

s.polam

Reputation: 10382

Build the arguments for `stack` dynamically and reuse the resulting expression wherever it is needed. The code below generates the same values dynamically.

scala> val js = Seq("""{"1": "a", "2": "b","3":"c","4":"d","5":"e"}""").toDS
js: org.apache.spark.sql.Dataset[String] = [value: string]

scala> val df = spark.read.json(js)
df: org.apache.spark.sql.DataFrame = [1: string, 2: string ... 3 more fields]

scala> val stack = s"""stack(${df.columns.length},${df.columns.flatMap(c => Seq(s"'${c}'",s"`${c}`")).mkString(",")}) as (id,name)"""
stack: String = stack(5,'1',`1`,'2',`2`,'3',`3`,'4',`4`,'5',`5`) as (id,name)

scala> df.select(expr(stack)).show(false)
+---+----+
|id |name|
+---+----+
|1  |a   |
|2  |b   |
|3  |c   |
|4  |d   |
|5  |e   |
+---+----+


scala> spark.sql(s"""select ${stack} from table """).show(false)
+---+----+
|id |name|
+---+----+
|1  |a   |
|2  |b   |
|3  |c   |
|4  |d   |
|5  |e   |
+---+----+


Updated code to read the data from a JSON file:

scala> "hdfs dfs -cat /tmp/sample.json".!
{"1": "a", "2": "b","3":"c","4":"d","5":"e"}
res4: Int = 0

scala> val df = spark.read.json("/tmp/sample.json")
df: org.apache.spark.sql.DataFrame = [1: string, 2: string ... 3 more fields]

scala> val stack = s"""stack(${df.columns.length},${df.columns.flatMap(c => Seq(s"'${c}'",s"`${c}`")).mkString(",")}) as (id,name)"""
stack: String = stack(5,'1',`1`,'2',`2`,'3',`3`,'4',`4`,'5',`5`) as (id,name)

scala> df.select(expr(stack)).show(false)
+---+----+
|id |name|
+---+----+
|1  |a   |
|2  |b   |
|3  |c   |
|4  |d   |
|5  |e   |
+---+----+

scala> df.createTempView("table")

scala> spark.sql(s"""select ${stack} from table """).show(false)
+---+----+
|id |name|
+---+----+
|1  |a   |
|2  |b   |
|3  |c   |
|4  |d   |
|5  |e   |
+---+----+
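The expression-building step can be sketched without a SparkSession, since it only manipulates the column-name list. This is a minimal standalone sketch (the `StackExpr` object and `build` helper are hypothetical names, not part of Spark); note that `columns.length` (the column count) is what `stack` needs as its first argument, whereas `columns.max` would compare the names lexicographically and return e.g. "9" for columns "1".."10":

```scala
// Hypothetical helper: assembles the same stack() SQL expression that
// the answer builds, from a plain sequence of column names.
object StackExpr {
  def build(columns: Seq[String]): String = {
    // For each column c, emit the pair 'c', `c` (literal id, column value).
    val pairs = columns.flatMap(c => Seq(s"'$c'", s"`$c`")).mkString(",")
    // stack(n, ...) needs the number of rows to produce, i.e. the column count.
    s"stack(${columns.length},$pairs) as (id,name)"
  }

  def main(args: Array[String]): Unit = {
    // prints: stack(3,'1',`1`,'2',`2`,'3',`3`) as (id,name)
    println(build(Seq("1", "2", "3")))
  }
}
```

The resulting string can then be passed to `expr(...)` in `df.select` or spliced into a `spark.sql` query exactly as in the transcripts above.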

Upvotes: 2
