Liangju Zeng

Reputation: 343

Compilation fails when using sbt to build a self-contained Spark application

I was trying to build a self-contained application using sbt.

Code: https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/ml/ALSExample.scala

Tutorial: http://spark.apache.org/docs/latest/quick-start.html#self-contained-applications

But I got the following errors:

[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:22: object ml is not a member of package org.apache.spark
[error] import org.apache.spark.ml.evaluation.RegressionEvaluator
[error]                         ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:23: object ml is not a member of package org.apache.spark
[error] import org.apache.spark.ml.recommendation.ALS
[error]                         ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:25: object sql is not a member of package org.apache.spark
[error] import org.apache.spark.sql.SparkSession
[error]                         ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:46: not found: value SparkSession
[error]     val spark = SparkSession
[error]                 ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:61: not found: type ALS
[error]     val als = new ALS()
[error]                   ^
[error] 5 errors found
[error] (compile:compileIncremental) Compilation failed

Why did this happen? BTW, the Spark version is 2.0.0.

Upvotes: 0

Views: 2849

Answers (1)

GameOfThrows

Reputation: 4510

So just as suspected, this error reflects the fact that you did not include all the Spark libraries in your build file. The one you are missing is:

 "org.apache.spark" %% "spark-mllib" % "2.0.0"

If you are using DataFrames, you will also need:

 "org.apache.spark" %% "spark-sql" % "2.0.0"
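Putting it together, a minimal `build.sbt` might look like the sketch below. The project name and exact Scala patch version are assumptions, so adjust them to your setup; the key point is that Spark 2.0.0 is built against Scala 2.11, and all Spark artifacts should use the same version:

```scala
// build.sbt -- minimal sketch; name and scalaVersion are assumptions
name := "als-example"

version := "1.0"

// Spark 2.0.0 artifacts are published for Scala 2.11
scalaVersion := "2.11.8"

// %% appends the Scala binary version (_2.11) to the artifact name
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "2.0.0",
  "org.apache.spark" %% "spark-sql"   % "2.0.0",
  "org.apache.spark" %% "spark-mllib" % "2.0.0"
)
```

After updating `build.sbt`, run `sbt package` again so sbt resolves the new dependencies before compiling.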

Upvotes: 5
