matanox

Reputation: 13686

making sbt scala build file bring in apache spark

I have a Scala build file under my project directory as follows, but my imports fail. What is the idiomatic way to solve this? Are Scala build files still the recommended way to define a build, as opposed to build.sbt definitions? The official documentation doesn't offer any insight.

import sbt.{Def, ExclusionRule, ModuleID}
import sbt.Keys.{dependencyOverrides, libraryDependencies}
import sbt._

object MyBuild {

  lazy val sparkVersion = "2.2.1"

  val commonDependencies: Seq[ModuleID] = Seq()

  val sparkDependencies: Seq[ModuleID] = Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion,
    "org.apache.spark" %% "spark-sql" % sparkVersion,
    "org.apache.hadoop" % "hadoop-common" % sparkVersion sparkVersion,
    "org.apache.spark" %% "spark-hive" % sparkVersion,
    "org.apache.hadoop" % "hadoop-client" % "2.7.2"
  )

  lazy val project = Project("my-project", file("."))
    .settings(
      libraryDependencies ++= sparkDependencies
    )
}

In my source code, this import fails:

import org.apache.spark.sql.DataFrame

What's the simple solution? Do I need to explicitly indicate that this object should be picked up as the build definition, or does that happen by default?

build.properties:

sbt.version = 0.13.16

Upvotes: 0

Views: 68

Answers (1)

Ashwanth Kumar

Reputation: 677

You might want to make the following change to your build definition:

object MyBuild extends Build {
  ....
}
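
For reference, a minimal sketch of what the full project/MyBuild.scala could look like on sbt 0.13.x. The hadoop-common version here ("2.7.2", aligned with your hadoop-client entry) is an assumption rather than something taken from your original build:

import sbt._
import sbt.Keys._

// Sketch of a project/MyBuild.scala that sbt 0.13.x will pick up,
// because the object extends Build.
object MyBuild extends Build {

  lazy val sparkVersion = "2.2.1"

  val sparkDependencies: Seq[ModuleID] = Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion,
    "org.apache.spark" %% "spark-sql" % sparkVersion,
    "org.apache.spark" %% "spark-hive" % sparkVersion,
    // Assumed: hadoop-common pinned to the same version as hadoop-client.
    "org.apache.hadoop" % "hadoop-common" % "2.7.2",
    "org.apache.hadoop" % "hadoop-client" % "2.7.2"
  )

  // Becomes the root project, so these settings actually apply to your sources.
  lazy val root = Project("my-project", file("."))
    .settings(
      libraryDependencies ++= sparkDependencies
    )
}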

To avoid this kind of common issue, try moving to the build.sbt approach of defining your build in sbt, unless you have a lot of customization in your build definition.
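
For example, a rough build.sbt equivalent placed at the project root (again assuming hadoop-common should match hadoop-client at 2.7.2) could look like:

// build.sbt at the root of the project, next to the src directory
lazy val sparkVersion = "2.2.1"

lazy val root = (project in file("."))
  .settings(
    name := "my-project",
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % sparkVersion,
      "org.apache.spark" %% "spark-sql" % sparkVersion,
      "org.apache.spark" %% "spark-hive" % sparkVersion,
      // Assumed: hadoop versions aligned with each other, not with Spark.
      "org.apache.hadoop" % "hadoop-common" % "2.7.2",
      "org.apache.hadoop" % "hadoop-client" % "2.7.2"
    )
  )

With this in place, sbt picks up the dependencies without any Build object, and spark-sql classes such as org.apache.spark.sql.DataFrame become importable from your sources.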

Upvotes: 1
