sai krishna

Reputation: 1

Compiling with scalac does not find sbt dependencies

I tried running my Scala code in the VSCode editor. I can run my script via the spark-submit command, but when I try to compile with scalac, I get:

.\src\main\scala\sample.scala:1: error: object apache is not a member of package org
import org.apache.spark.sql.{SQLContext, SparkSession}

I have already added the respective library dependencies to build.sbt.
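For reference, a minimal build.sbt for a Spark project might look like the sketch below. The Scala and Spark version numbers are assumptions; they should match the runtime you use with spark-submit.

```scala
// build.sbt -- sketch; versions are assumptions, align them with your Spark runtime
ThisBuild / scalaVersion := "2.12.18"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.4.1",
  "org.apache.spark" %% "spark-sql"  % "3.4.1"
)
```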


Upvotes: 0

Views: 99

Answers (1)

Gaël J

Reputation: 15275

Have you tried running sbt compile?

Running scalac directly means you're compiling only one file, without the benefits of sbt, and in particular without the dependencies you declared in your build.sbt file.

In an sbt project, there's no reason to use scalac directly. This defeats the purpose of sbt.
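The usual workflow, sketched as shell commands run from the project root (the directory containing build.sbt):

```shell
sbt compile   # resolves the dependencies declared in build.sbt, then compiles all sources
sbt run       # compiles if needed and runs the project's main class
sbt package   # builds a jar under target/ that you can pass to spark-submit
```

sbt reads build.sbt, downloads the declared libraries, and puts them on the compile classpath for you, which is exactly what a bare scalac invocation does not do.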

Upvotes: 2
