gsm113

Reputation: 41

How can I convert all decimal columns in a Spark DataFrame to double type (Scala)?

I have a DataFrame with decimal and string columns. I want to cast all the decimal columns to double without naming each one individually. I've tried the following without success; I'm fairly new to Spark.

df.printSchema

root
 |-- var1: decimal(38,10) (nullable = true)
 |-- var2: decimal(38,10) (nullable = true)
 |-- var3: decimal(38,10) (nullable = true)
…
(150 more decimal and string columns)

I tried:

import org.apache.spark.sql.types._

val cols = df.columns.map(x => {
    if (x.dataType == DecimalType(38,0)) col(x).cast(DoubleType) 
    else col(x)
})

and I get:

<console>:30: error: value dataType is not a member of String
           if (x.dataType == DecimalType(38,0)) col(x).cast(DoubleType)

Upvotes: 1

Views: 7804

Answers (1)

abiratsis

Reputation: 7316

The issue here is that df.columns returns an Array[String] containing only the column names, and dataType is a member of the StructField class, not of String. To get at the DataType you must use df.schema.fields instead, which exposes the fields as an Array[StructField]:

import org.apache.spark.sql.types.{StructField, Decimal, DecimalType, DoubleType}
import org.apache.spark.sql.functions.col
import spark.implicits._ // for toDF (already in scope in spark-shell)

val df = Seq(
  (130, Decimal(122.45), "t1"),
  (536, Decimal(1.45), "t2"),
  (518, Decimal(0.45), "t3"))
  .toDF("ID", "decimal", "tmp")

df.printSchema
// root
//  |-- ID: integer (nullable = false)
//  |-- decimal: decimal(38,18) (nullable = true)
//  |-- tmp: string (nullable = true)

val decimalSchema = df.schema.fields.map { f =>
  f match {
    // cast decimal columns to double, keep every other column as-is
    case StructField(name: String, _: DecimalType, _, _) => col(name).cast(DoubleType)
    case _ => col(f.name)
  }
}

df.select(decimalSchema:_*).printSchema
// root
//  |-- ID: integer (nullable = false)
//  |-- decimal: double (nullable = true)
//  |-- tmp: string (nullable = true)

The map returns an array of columns in which every DecimalType column is cast to DoubleType, while all other columns are passed through unchanged.
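
A minimal alternative sketch, not from the original answer and assuming the same df as above: the same cast can also be written as a fold over withColumn, which may read more naturally if you are already chaining DataFrame transformations.

val casted = df.schema.fields.foldLeft(df) { (acc, f) =>
  f.dataType match {
    // replace each decimal column in place with its double cast
    case _: DecimalType => acc.withColumn(f.name, col(f.name).cast(DoubleType))
    case _              => acc
  }
}

casted.printSchema
// root
//  |-- ID: integer (nullable = false)
//  |-- decimal: double (nullable = true)
//  |-- tmp: string (nullable = true)

Both approaches produce the same schema; the select with a mapped column list is a single projection, while the fold issues one withColumn per decimal column.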

Upvotes: 7
