Reputation: 2074
So I wrote this attempt (which doesn't work) at averaging every FloatType column in my data frame, like so:
import scala.collection.mutable.ListBuffer

val descript = df.dtypes
var decimalArr = new ListBuffer[String]()
for (i <- 0 to (descript.length - 1)) {
  if (descript(i)._2 == "FloatType") {
    decimalArr += descript(i)._1
  }
}
// Build statistical arguments for DataFrame pass
var averageList = new ListBuffer[String]()
for (i <- 0 to (decimalArr.length - 1)) {
  averageList += "avg(" + '"' + decimalArr(i) + '"' + ")"
}
// sample statistical call
val sampAvg = df.agg(averageList).show
The content produced in averageList looks like this:
ListBuffer(avg("offer_id"), avg("decision_id"), avg("offer_type_cd"), avg("promo_id"), avg("pymt_method_type_cd"), avg("cs_result_id"), avg("cs_result_usage_type_cd"), avg("rate_index_type_cd"), avg("sub_product_id"))
The clear problem is that val sampAvg = df.agg(averageList).show does not accept a ListBuffer as input. Even converting it with .toString doesn't work; agg wants org.apache.spark.sql.Column*. Does anyone know a way to do something along the lines of what I am trying?
Side note: I am on Spark 1.3.
Upvotes: 0
Views: 1561
Reputation: 330413
You can first build a list of the aggregate expressions
import org.apache.spark.sql.functions.{col, avg, lit}

val exprs = df.dtypes
  .filter(_._2 == "DoubleType")
  .map(ct => avg(col(ct._1)))
  .toList
and either pattern match
exprs match {
  case h :: t => df.agg(h, t: _*)
  case _ => sqlContext.emptyDataFrame
}
or use a dummy column (agg requires at least one leading Column before the varargs, so the placeholder fills that slot and is dropped afterwards)
df.agg(lit(1).alias("_dummy"), exprs: _*).drop("_dummy")
If you want to use multiple functions you can flatMap, either explicitly:
import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.{avg, min, max}

val funs: List[(String => Column)] = List(min, max, avg)

val exprs: Array[Column] = df.dtypes
  .filter(_._2 == "DoubleType")
  .flatMap(ct => funs.map(fun => fun(ct._1)))
or using a for comprehension:
val exprs: Array[Column] = for {
  cname <- df.dtypes.filter(_._2 == "DoubleType").map(_._1)
  fun <- funs
} yield fun(cname)
Convert exprs to a List if you want to use the pattern matching approach.
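For completeness, here is a minimal sketch of how the assembled expressions might be applied end to end, assuming the df, sqlContext, and exprs defined above (the intermediate names are only illustrative):

// convert the Array[Column] into a List so it can be pattern matched
val exprList = exprs.toList

// aggregate only if there is at least one expression, otherwise fall back to an empty frame
val result = exprList match {
  case h :: t => df.agg(h, t: _*)
  case _ => sqlContext.emptyDataFrame
}

result.show()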
Upvotes: 3