Reputation: 546
I have a list with more than 30 strings. How do I convert the list into a DataFrame? What I tried:
e.g.
val list = List("a","b","v","b").toDS().toDF()
Output:
+-----+
|value|
+-----+
|    a|
|    b|
|    v|
|    b|
+-----+
Expected output:
+---+---+---+---+
| _1| _2| _3| _4|
+---+---+---+---+
| a| b| v| b|
+---+---+---+---+
Any help on this would be appreciated.
Upvotes: 11
Views: 45698
Reputation: 11
This will do it (note that map_from_arrays requires Spark 2.4+):
import org.apache.spark.sql.functions._
import spark.implicits._

val data = List(("Value1", "Cvalue1", 123, 2254, 22), ("Value1", "Cvalue2", 124, 2255, 23))
val df = spark.sparkContext.parallelize(data).toDF("Col1", "Col2", "Expend1", "Expend2", "Expend3")
val cols = Array("Expend1", "Expend2", "Expend3")
val df1 = df
  .withColumn("keys", array(cols.map(lit): _*))
  .withColumn("values", array($"Expend1", $"Expend2", $"Expend3"))
  .select($"Col1", $"Col2", explode_outer(map_from_arrays($"keys", $"values")))
df1.show(false)
Upvotes: 1
Reputation: 419
In order to use toDF, we have to import
import spark.sqlContext.implicits._
Please refer to the code below:
import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder
  .master("local[*]")
  .appName("Simple Application")
  .getOrCreate()
import spark.sqlContext.implicits._

val lstData = List(List("vks", 30), List("harry", 30))
val mapLst = lstData.map { case List(a: String, b: Int) => (a, b) }
val lstToDf = spark.sparkContext.parallelize(mapLst).toDF("name", "age")
lstToDf.show

val llist = Seq(("bob", "2015-01-13", 4), ("alice", "2015-04-23", 10)).toDF("name", "date", "duration")
llist.show
Upvotes: 10
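As a variant of the approach above, a case class gives the columns their names without passing them to toDF, since Spark derives the schema from the field names. A minimal sketch, assuming a SparkSession named spark is in scope (the Person class is illustrative, not part of the original answer; in the spark-shell, define the case class at the top level rather than inside a method):

```scala
import spark.implicits._

// Spark derives the schema (name: string, age: int) from the case class fields.
case class Person(name: String, age: Int)

val people = List(Person("vks", 30), Person("harry", 30))
val peopleDf = people.toDF()   // columns are named "name" and "age"
peopleDf.show
```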
Reputation: 1023
List("a","b","c","d")
represents a record with one field, so the result set displays one element per row.
To get the expected output, each row should have four fields/elements in it. So, we wrap the elements in a tuple: List(("a","b","c","d"))
represents one row with four fields.
Similarly, a list with two rows is written as List(("a1","b1","c1","d1"),("a2","b2","c2","d2"))
scala> val list = sc.parallelize(List(("a", "b", "c", "d"))).toDF()
list: org.apache.spark.sql.DataFrame = [_1: string, _2: string, _3: string, _4: string]
scala> list.show
+---+---+---+---+
| _1| _2| _3| _4|
+---+---+---+---+
| a| b| c| d|
+---+---+---+---+
scala> val list = sc.parallelize(List(("a1","b1","c1","d1"),("a2","b2","c2","d2"))).toDF
list: org.apache.spark.sql.DataFrame = [_1: string, _2: string, _3: string, _4: string]
scala> list.show
+---+---+---+---+
| _1| _2| _3| _4|
+---+---+---+---+
| a1| b1| c1| d1|
| a2| b2| c2| d2|
+---+---+---+---+
Upvotes: 7
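One caveat for the question as asked: the tuple approach caps out at 22 elements, because Scala has no Tuple23, so it cannot handle the 30-plus strings directly. One way around that is to build a single Row with an explicit schema. A minimal sketch, assuming a SparkSession named spark is in scope (the 30 generated strings are placeholders for the question's list):

```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// A list too long for a tuple (tuples stop at Tuple22).
val strings = (1 to 30).map(i => s"s$i").toList

// One StringType column per element, named _1, _2, ... like toDF would.
val schema = StructType(strings.indices.map(i => StructField(s"_${i + 1}", StringType)))

// A single Row holding all the values, then a one-row DataFrame.
val oneRow = spark.sparkContext.parallelize(Seq(Row.fromSeq(strings)))
val df = spark.createDataFrame(oneRow, schema)
df.show()
```

This scales to any list length, at the cost of writing the schema by hand instead of relying on the tuple encoders.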