raj247

Reputation: 401

Spark Scala: Access data inside struct which is inside of an array

The schema looks like this:

root
|-- orderitemlist: array (nullable = true)
| |-- element: struct (containsNull = true)
| | |-- internal-material-code: string (nullable = true)
| | |-- lot-number: string (nullable = true)
| | |-- packaging-item-code: string (nullable = true)
| | |-- packaging-item-code-type: string (nullable = true)

How do I access the values for internal-material-code and lot-number?

After creating the DataFrame, I tried this:

df.withColumn("internalmaterialcode", col("orderitemlist")(0).getItem("internal-material-code"))

I also tried:

df.withColumn("internalmaterialcode", col("orderitemlist")(0)("internal-material-code"))

as well as:

df.withColumn("orderitemlistarray", explode(col("orderitemlist"))) 
.withColumn("internalmaterialcode", col("orderitemlistarray").getItem("internal-material-code")) 

and:

df.withColumn("orderitemlistarray", explode(col("orderitemlist"))) 
.withColumn("internalmaterialcode", col("orderitemlistarray.internal-material-code")) 

but each of these gives null.

I have seen similar-looking schemas in other Stack Overflow questions, but none of the answers worked for me. Could someone answer this or point me to the right place?

Upvotes: 2

Views: 2125

Answers (2)

Sarath Chandra Vema

Reputation: 812

I went through the code block you shared and it works fine. Please go through my work here (as an extension of the earlier solution):

>>>df.withColumn("ves", $"orderitemlist.lot-number").show
+--------------------+--------+
|       orderitemlist|     ves|
+--------------------+--------+
|[[123, vv, pp, ll...|[vv, vv]|
+--------------------+--------+

>>>df.withColumn("vew", $"orderitemlist".getItem("lot-number")).show
+--------------------+--------+
|       orderitemlist|     vew|
+--------------------+--------+
|[[123, vv, pp, ll...|[vv, vv]|
+--------------------+--------+
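
As a further illustration (just a sketch, assuming the same df as above and Spark 2.4+ for element_at; the column names are only illustrative), you can also index into the array to read a field from a single element:

import spark.implicits._                               // for the $ syntax
import org.apache.spark.sql.functions.element_at

df.withColumn("firstlotnumber", $"orderitemlist"(0)("lot-number"))              // 0-based index into the array
  .withColumn("firstlotnumber2", element_at($"orderitemlist", 1)("lot-number")) // element_at is 1-based
  .show()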

Upvotes: 1

notNull

Reputation: 31540

After the explode, select the newly created column with .* and it will give you all the fields from the struct.

Example:

val va="""{
    "orderitemlist": [{
        "internal-material-code": "123",
        "lot-number": "vv",
        "packaging-item-code": "pp",
        "packaging-item-code-type": "ll"
    },{
        "internal-material-code": "234",
        "lot-number": "vv",
        "packaging-item-code": "pp",
        "packaging-item-code-type": "ll"
    }]
}"""

import spark.implicits._                               // needed for Seq(va).toDS
import org.apache.spark.sql.functions.{col, explode}

val df = spark.read.json(Seq(va).toDS).toDF            // build a DataFrame from the sample JSON

df.withColumn("arr", explode(col("orderitemlist"))).select("arr.*").show()

Result:

+----------------------+----------+-------------------+------------------------+
|internal-material-code|lot-number|packaging-item-code|packaging-item-code-type|
+----------------------+----------+-------------------+------------------------+
|                   123|        vv|                 pp|                      ll|
|                   234|        vv|                 pp|                      ll|
+----------------------+----------+-------------------+------------------------+

Now you get all the columns from the struct inside the array!
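
If you only need the two fields from the question, a variation of the same idea (a sketch, assuming the df and imports from the example above; the alias names are just illustrative) could be:

df.withColumn("arr", explode(col("orderitemlist")))
  .select(
    col("arr")("internal-material-code").alias("internalmaterialcode"),  // extract struct fields by name
    col("arr")("lot-number").alias("lotnumber")
  )
  .show()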

Upvotes: 3
