Reputation: 343
I was trying to compile an application using sbt.
Tutorial: http://spark.apache.org/docs/latest/quick-start.html#self-contained-applications
But errors happened.
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:22:object ml is not a member of package org.apache.spark
[error] import org.apache.spark.ml.evaluation.RegressionEvaluator
[error] ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:23: object ml is not a member of package org.apache.spark
[error] import org.apache.spark.ml.recommendation.ALS
[error] ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:25: object sql is not a member of package org.apache.spark
[error] import org.apache.spark.sql.SparkSession
[error] ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:46: not found: value SparkSession
[error] val spark = SparkSession
[error] ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:61: not found: type ALS
[error] val als = new ALS()
[error] ^
[error] 5 errors found
[error] (compile:compileIncremental) Compilation failed
Why did this happen? BTW, the Spark version is 2.0.0.
Upvotes: 0
Views: 2849
Reputation: 4510
So, just as suspected, these errors reflect the fact that you did not include all the Spark libraries in your build file. The one you are missing is:
"org.apache.spark" %% "spark-mllib" % "2.0.0"
If you are using DataFrames, you will also need:
"org.apache.spark" %% "spark-sql" % "2.0.0"
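Putting it together, a complete `build.sbt` might look like the sketch below (the project name and Scala version are assumptions; Spark 2.0.0 is built against Scala 2.11):

```scala
// build.sbt — minimal sketch for the ALS example above
name := "ALSExample"

version := "1.0"

scalaVersion := "2.11.8"  // Spark 2.0.0 targets Scala 2.11

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "2.0.0",  // SparkContext and core APIs
  "org.apache.spark" %% "spark-sql"   % "2.0.0",  // SparkSession, DataFrames
  "org.apache.spark" %% "spark-mllib" % "2.0.0"   // org.apache.spark.ml (ALS, RegressionEvaluator)
)
```

The `%%` operator appends the Scala binary version (`_2.11`) to the artifact name, so the Scala version in the build file must match the one Spark was compiled against.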
Upvotes: 5