Alina

Reputation: 2261

Type change in Spark DataFrame struct

I have the following schema:

root
 |-- Id: long (nullable = true)    
 |-- element: struct (containsNull = true)
 |    |-- Amount: double (nullable = true)
 |    |-- Currency: string (nullable = true)

I want to change the type of Amount to integer. Using withColumn does not work; the type stays the same:

df.withColumn("element.Amount", $"element.Amount".cast(sql.types.IntegerType))

How do I change the type of a column that is nested inside a struct?

Upvotes: 1

Views: 2054

Answers (1)

Alper t. Turker

Reputation: 35229

If you cannot fix the problem in the source, you can cast:

case class Amount(amount: Double, currency: String)
case class Row(id: Long, element: Amount)

val df = Seq(Row(1L, Amount(0.96, "EUR"))).toDF

val dfCasted = df.withColumn(
  "element", $"element".cast("struct<amount: integer, currency: string>")
)

dfCasted.show
// +---+--------+
// | id| element|
// +---+--------+
// |  1|[0, EUR]|
// +---+--------+
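// note: the integer cast truncated 0.96 to 0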


dfCasted.printSchema
// root
//  |-- id: long (nullable = false)
//  |-- element: struct (nullable = true)
//  |    |-- amount: integer (nullable = true)
//  |    |-- currency: string (nullable = true)
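
The cast matches struct fields by position and the result takes the field names from the target type, so with the schema from the question the same approach would look something like this (field names taken from the question):

df.withColumn("element", $"element".cast("struct<Amount: integer, Currency: string>"))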

In simple cases you can try to rebuild the tree:

import org.apache.spark.sql.functions._

dfCasted.withColumn(
  "element",
  struct($"element.amount".cast("integer"), $"element.currency")
)
// org.apache.spark.sql.DataFrame = [id: bigint, element: struct<col1: int, currency: string>]

but it doesn't scale for complex trees.
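
Note that the rebuilt field also loses its original name (col1 in the output above); aliasing the cast, e.g. $"element.amount".cast("integer").alias("amount"), keeps it.

For deeper trees, one option (a sketch, not part of the original answer) is to derive the target type from the existing schema and still cast the whole struct in one go. It assumes spark.implicits._ is in scope as in the snippets above; replaceDoubles is a hypothetical helper, not a Spark API:

import org.apache.spark.sql.types._

// Hypothetical helper: walk the schema and swap DoubleType for IntegerType,
// leaving field names, nullability and nesting untouched.
def replaceDoubles(dt: DataType): DataType = dt match {
  case StructType(fields) =>
    StructType(fields.map(f => f.copy(dataType = replaceDoubles(f.dataType))))
  case ArrayType(elem, containsNull) =>
    ArrayType(replaceDoubles(elem), containsNull)
  case DoubleType => IntegerType
  case other      => other
}

// Cast the whole struct once, using the derived type.
val dfCastedDeep = df.withColumn(
  "element",
  $"element".cast(replaceDoubles(df.schema("element").dataType))
)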

Upvotes: 2
