TheMP

Reputation: 8427

Aggregate function in spark-sql not found

I am new to Spark and I am trying to make use of some aggregate features, like sum or avg. My query in spark-shell works perfectly:

val somestats = pf.groupBy("name").agg(sum("days")).show()

When I try to run it from a Scala project, however, it does not work and throws an error message:

not found: value sum

I have tried to add

import sqlContext.implicits._
import org.apache.spark.SparkContext._

just before the command, but it does not help. My Spark version is 1.4.1. Am I missing anything?

Upvotes: 14

Views: 14312

Answers (2)

Zernike

Reputation: 1766

You can use the sum method directly on GroupedData (groupBy returns this type):

val somestats = pf.groupBy("name").sum("days").show()
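Note that the agg form from the question is still useful when you want to combine several different aggregates (for example sum and avg) in a single groupBy, whereas GroupedData.sum only computes sums over the named columns.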

Upvotes: 1

Justin Pihony

Reputation: 67135

You need this import:

import org.apache.spark.sql.functions._
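For context, here is a minimal, self-contained sketch showing how that import fits together with the query from the question. It assumes a Spark 1.4-style SQLContext and uses hypothetical sample data standing in for the asker's pf DataFrame with name and days columns:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.functions._ // brings sum, avg, etc. into scope

object AggExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("agg-example").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._ // enables toDF on local collections

    // Hypothetical sample data standing in for the asker's pf DataFrame
    val pf = Seq(("alice", 3), ("alice", 5), ("bob", 2)).toDF("name", "days")

    // sum now resolves because of the functions._ import above
    pf.groupBy("name").agg(sum("days")).show()

    sc.stop()
  }
}

Without the functions._ import, sum is not in scope as a free-standing function, which is exactly what the "not found: value sum" error is reporting.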

Upvotes: 38
