Reputation: 449
I am trying to run a local jar file with spark-submit, which works perfectly fine. Here is the command:
spark-submit --class "SimpleApp" --master local myProject/target/scala-2.11/simple-project_2.11-1.0.jar
But when I try it with curl:
curl -X POST --data '{
  "file": "file:///home/user/myProject/target/scala-2.11/simple-project_2.11-1.0.jar",
  "className": "SimpleApp"
}' \
  -H "Content-Type: application/json" \
  http://server:8998/batches
it throws this error:
"requirement failed: Local path /home/user/myProject/target/scala-2.11/simple-project_2.11-1.0.jar cannot be added to user sessions."
Here is my livy.conf file, as some articles suggest changing a few things:
# What host address to start the server on. By default, Livy will bind to all network interfaces.
livy.server.host = 0.0.0.0
# What port to start the server on.
livy.server.port = 8998
# What spark master Livy sessions should use.
livy.spark.master = local
# What spark deploy mode Livy sessions should use.
livy.spark.deploy-mode = client
# List of local directories from where files are allowed to be added to user sessions. By
# default it's empty, meaning users can only reference remote URIs when starting their
# sessions.
livy.file.local-dir-whitelist = /home/user/.livy-sessions/
Please help me out with this.
Thanks in advance.
Upvotes: 8
Views: 6088
Reputation: 1
The answer below worked for me, as stated here: Apache Livy cURL not working for spark-submit command
To use local files for Livy batch jobs, you need to add the local folder to the livy.file.local-dir-whitelist property in livy.conf.
Description from livy.conf.template:
List of local directories from where files are allowed to be added to user sessions. By default it's empty, meaning users can only reference remote URIs when starting their sessions.
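For example (a sketch, not from the original answer, reusing the whitelisted directory /home/user/.livy-sessions/ from the question's livy.conf): copy the jar into the whitelisted folder and point the batch request at it with a file:// URI:
cp myProject/target/scala-2.11/simple-project_2.11-1.0.jar /home/user/.livy-sessions/
curl -X POST --data '{
  "file": "file:///home/user/.livy-sessions/simple-project_2.11-1.0.jar",
  "className": "SimpleApp"
}' \
  -H "Content-Type: application/json" \
  http://server:8998/batches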
Upvotes: 0
Reputation: 449
I recently found the solution for reading a local file with Apache Livy: I was building the wrong request with cURL. I just replaced the file protocol 'file://' with 'local:/' and that worked for me.
curl -X POST --data '{
  "file": "local:/home/user/myProject/target/scala-2.11/simple-project_2.11-1.0.jar",
  "className": "SimpleApp"
}' \
  -H "Content-Type: application/json" \
  http://server:8998/batches
That was quite a small mistake, but my jar file still cannot be accessed from HDFS.
Thank you all for helping out.
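A small follow-up note (not part of the original fix): the POST to /batches returns a JSON body containing the batch id, and Livy's REST API lets you poll that batch to check whether the jar was actually picked up. With batch id 0 (just an illustration):
curl http://server:8998/batches/0/state
curl http://server:8998/batches/0/log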
Upvotes: 10
Reputation:
The presence of the Apache Livy jar file is a mandatory requirement; it wouldn't work without the corresponding jar file.
My advice: just append the Livy jar file to the classpath with Java's -cp option:
java -cp /usr/local/livy.jar com.myclass.Main
or simply use SBT:
libraryDependencies += "org.apache.livy" % "livy-api" % "0.4.0-incubating"
Maven:
<dependency>
<groupId>org.apache.livy</groupId>
<artifactId>livy-api</artifactId>
<version>0.4.0-incubating</version>
</dependency>
Or your favorite build tool.
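To illustrate what that dependency gives you, here is a minimal sketch of the Livy programmatic API (my own example, not from this question; it assumes the Livy server from the question at http://server:8998 and additionally needs livy-client-http on the classpath for the HTTP client implementation):
import java.io.File
import java.net.URI
import org.apache.livy.{Job, JobContext, LivyClient, LivyClientBuilder}

// A trivial job; jobs are serialized and sent to the cluster, so keep it a top-level class.
class ParallelismJob extends Job[Integer] {
  override def call(ctx: JobContext): Integer = ctx.sc().defaultParallelism
}

object LivyApiSketch {
  def main(args: Array[String]): Unit = {
    // Connect to the Livy server used in the question.
    val client: LivyClient = new LivyClientBuilder()
      .setURI(new URI("http://server:8998"))
      .build()
    try {
      // Ship the jar that contains ParallelismJob to the remote Spark context.
      client.uploadJar(new File("myProject/target/scala-2.11/simple-project_2.11-1.0.jar")).get()
      // Run the job on the cluster and fetch its result.
      val parallelism = client.submit(new ParallelismJob).get()
      println(s"defaultParallelism = $parallelism")
    } finally {
      client.stop(true)
    }
  }
}
None of this compiles unless the Livy jar is on the classpath, which is exactly what the -cp / SBT / Maven options above provide.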
BTW, you can also upload the Livy jar file to HDFS and use it on your Hadoop cluster; it can significantly simplify your life.
Upvotes: 0