RaAm

Reputation: 1092

Fetch Spark dataframe column list

How can I get all the column names of a Spark DataFrame into a Seq variable?

Input Data & Schema

val dataset1 = Seq(("66", "a", "4"), ("67", "a", "0"), ("70", "b", "4"), ("71", "d", "4")).toDF("KEY1", "KEY2", "ID")

dataset1.printSchema()
root
|-- KEY1: string (nullable = true)
|-- KEY2: string (nullable = true)
|-- ID: string (nullable = true)

I need to store all the column names in a variable using Scala. I have tried the following, but it is not working.

val selectColumns = dataset1.schema.fields.toSeq

selectColumns: Seq[org.apache.spark.sql.types.StructField] = WrappedArray(StructField(KEY1,StringType,true),StructField(KEY2,StringType,true),StructField(ID,StringType,true))

Expected output:

val selectColumns = Seq(
  col("KEY1"),
  col("KEY2"),
  col("ID")
)

selectColumns: Seq[org.apache.spark.sql.Column] = List(KEY1, KEY2, ID)
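For context, one reason to want Column objects rather than plain name strings is that they can carry transformations that are then applied uniformly in a single select. A minimal sketch (assuming the dataset1 DataFrame above and a running SparkSession):

```scala
import org.apache.spark.sql.functions.{col, upper}

// Each Column can wrap a transformation, e.g. upper-casing every field
// while keeping the original column names via .as(...):
val upperCols = Seq("KEY1", "KEY2", "ID").map(name => upper(col(name)).as(name))

// A Seq[Column] is splatted into select with : _*
val shouted = dataset1.select(upperCols: _*)
```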

Upvotes: 25

Views: 96748

Answers (5)

Abhi

Reputation: 6568

The column names can be fetched from the schema as well.

val dataset1 = Seq(("66", "a", "4"), ("67", "a", "0"), ("70", "b", "4"), ("71", "d", "4")).toDF("KEY1", "KEY2", "ID")
dataset1.printSchema()
root
 |-- KEY1: string (nullable = true)
 |-- KEY2: string (nullable = true)
 |-- ID: string (nullable = true)

val selectColumns = dataset1.schema.fieldNames
selectColumns: Array[String] = Array(KEY1, KEY2, ID)

val selectColumns2 = dataset1.schema.fieldNames.toSeq 
selectColumns2: Seq[String] = WrappedArray(KEY1, KEY2, ID)

Upvotes: 3

Krishna Reddy

Reputation: 1099

We can get the column names of a Dataset or table into a Seq variable in the following ways.

From a Dataset:

val col_seq:Seq[String] = dataset.columns.toSeq

From a table:

val col_seq: Seq[String] = spark.table("tablename").columns.toSeq

or, via the catalog (listColumns returns Rows, so extract the name field rather than calling toString on the Row):

val col_seq: Seq[String] = spark.catalog.listColumns("tablename").select('name).collect.map(_.getString(0)).toSeq

Upvotes: 3

uh_big_mike_boi

Reputation: 3470

I use the columns property, like so:

val cols = dataset1.columns.toSeq

Then, if you later want to select all the columns in the order of the sequence, you can use

val orderedDF = dataset1.select(cols.head, cols.tail: _*)

Upvotes: 7

RaAm

Reputation: 1092

val selectColumns = dataset1.columns.toList.map(col(_))
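A sketch of how the resulting Seq[Column] might be used, assuming the dataset1 DataFrame from the question and an active SparkSession:

```scala
import org.apache.spark.sql.functions.col

// Map every column name to a Column object...
val selectColumns = dataset1.columns.toList.map(col(_))

// ...then splat the list into select to project all columns:
val projected = dataset1.select(selectColumns: _*)
```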

Upvotes: 12

Yaron

Reputation: 10450

You can use the following command:

val selectColumns = dataset1.columns.toSeq

scala> val dataset1 = Seq(("66", "a", "4"), ("67", "a", "0"), ("70", "b", "4"), ("71", "d", "4")).toDF("KEY1", "KEY2", "ID")
dataset1: org.apache.spark.sql.DataFrame = [KEY1: string, KEY2: string ... 1 more field]

scala> val selectColumns = dataset1.columns.toSeq
selectColumns: Seq[String] = WrappedArray(KEY1, KEY2, ID)

Upvotes: 30
