C Kondaiah

Reputation: 72

How to create a schema in Spark with Scala when the input has more than 100 columns?

With a case class we have some restrictions (such as the 22-field limit in older Scala versions). Is it possible to use StructType for 100+ columns? Is there any other way to create a schema for around 600+ columns?

Upvotes: 1

Views: 731

Answers (1)

pasha701

Reputation: 7207

import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{StringType, StructField, StructType}
// Build the 600 StructFields programmatically instead of declaring each column by hand
val columns = (1 to 600).map(i => s"Column_$i").map(cname => StructField(cname, StringType))
val schemaWithSixHundredsColumns = StructType(columns)
val df = spark.createDataFrame(new java.util.ArrayList[Row](), schemaWithSixHundredsColumns)
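In practice a schema built this way is usually passed to a reader rather than used to create an empty DataFrame. A minimal sketch, reusing the StructType defined above and assuming a hypothetical CSV input path:

// Apply the generated schema when reading, instead of letting Spark infer 600 column types.
// The file path below is only a placeholder.
val dfFromFile = spark.read
  .option("header", "false")
  .schema(schemaWithSixHundredsColumns)
  .csv("/path/to/input.csv")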

Upvotes: 3
