Nikita Vlasenko

Reputation: 4352

Compiling Spark program: no 'lib' directory

I am going through the tutorial:

https://www.tutorialspoint.com/apache_spark/apache_spark_deployment.htm

When I got to the "Step 2: Compile program" section, I got stuck: there is no lib folder in the Spark directory, which looks like this:

[screenshot of the Spark installation directory listing]

Where is the lib folder? How can I compile the program? I looked in the jars folder, but there is no file named spark-assembly-1.4.0-hadoop2.6.0.jar.

Upvotes: 0

Views: 370

Answers (1)

addmeaning

Reputation: 1398

I am sorry I am not answering your question directly, but I want to point you to a more convenient development process for Spark applications.

When you are developing a Spark application on your local computer, you should use sbt (the Scala build tool). After you are done writing code, compile it with sbt (by running sbt assembly). sbt will produce a 'fat jar' archive that already contains all the dependencies the job requires. Then upload the jar to the Spark cluster and run it (for example with the spark-submit script). There is no reason to install sbt on your cluster, because it is needed only for compilation.
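Here is a minimal sketch of what that setup could look like. The project name, Spark/Scala versions, and the SimpleApp class name are assumptions for illustration; adjust them to your own project. First, enable the sbt-assembly plugin in project/plugins.sbt:

```scala
// project/plugins.sbt -- enables the `sbt assembly` task
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")
```

```scala
// build.sbt -- minimal build definition for a Spark job (hypothetical names)
name := "simple-spark-app"
version := "0.1"
scalaVersion := "2.11.12"

// Mark Spark as "provided": the cluster supplies it at runtime,
// so sbt-assembly leaves it out of the fat jar and keeps it small.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0" % "provided"
```

Running sbt assembly then writes the fat jar under target/scala-2.11/, and you can hand that jar to spark-submit, for example:

```sh
sbt assembly
spark-submit --class SimpleApp \
  --master local[4] \
  target/scala-2.11/simple-spark-app-assembly-0.1.jar
```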

You should check out the starter project that I created for you.

Upvotes: 1
