Reputation: 163
I am trying to deploy a Spark application (Java) to the Spark engine that is part of the Analytics Engine service on Bluemix. I followed the steps mentioned here.
After following all the steps, at the time of spark-submit I get the following error:
C:\Users\IBM_ADMIN\eclipse-workspace\mySparkApp\target>bx ae spark-submit --className mySparkApp.Main mySparkApp-0.0.1-SNAPSHOT.jar
Current user is 'clsadmin'
Password>
Contacting endpoint 'https://159.122.220.119:8443'...
FAILED
Server call failed. Message: '<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
<title>Error 500 Server Error</title>
</head>
<body><h2>HTTP ERROR 500</h2>
<p>Problem accessing /gateway/default/livy/v1/batches. Reason:
<pre> Server Error</pre></p><hr><i><small>Powered by Jetty://</small></i><hr/>
</body>
</html>
'
I am able to access the Ambari server and can see the Spark services up and running. I am also able to access the Livy API endpoint from the browser:
https://chs-uvi-769-mn001.bi.services.eu-gb.bluemix.net:8443/gateway/default/livy/v1/batches
{"from":0,"total":5,"sessions":[{"id":0,"state":"dead","appId":null,"appInfo":{"driverLogUrl":null,"sparkUiUrl":null},"log":["java.lang.Exception: No YARN application is found with tag livy-batch-0-t8fc4ebv in 60 seconds. Please check your cluster status, it is may be very busy.","com.cloudera.livy.utils.SparkYarnApp.com$cloudera$livy$utils$SparkYarnApp$$getAppIdFromTag(SparkYarnApp.scala:182) com.cloudera.livy.utils.SparkYarnApp$$anonfun$1$$anonfun$4.apply(SparkYarnApp.scala:248) com.cloudera.livy.utils.SparkYarnApp$$anonfun$1$$anonfun$4.apply(SparkYarnApp.scala:245) scala.Option.getOrElse(Option.scala:120) com.cloudera.livy.utils.SparkYarnApp$$anonfun$1.apply$mcV$sp(SparkYarnApp.scala:245) com.cloudera.livy.Utils$$anon$1.run(Utils.scala:95)"]},{"id":1,"state":"dead","appId":null,"appInfo":{"driverLogUrl":null,"sparkUiUrl":null},"log":["java.lang.Exception: No YARN application is found with tag livy-batch-1-1olxdmt5 in 60 seconds. Please check your cluster status, it is may be very busy.","com.cloudera.livy.utils.SparkYarnApp.com$cloudera$livy$utils$SparkYarnApp$$getAppIdFromTag(SparkYarnApp.scala:182) com.cloudera.livy.utils.SparkYarnApp$$anonfun$1$$anonfun$4.apply(SparkYarnApp.scala:248) com.cloudera.livy.utils.SparkYarnApp$$anonfun$1$$anonfun$4.apply(SparkYarnApp.scala:245) scala.Option.getOrElse(Option.scala:120) com.cloudera.livy.utils.SparkYarnApp$$anonfun$1.apply$mcV$sp(SparkYarnApp.scala:245) com.cloudera.livy.Utils$$anon$1.run(Utils.scala:95)"]},{"id":2,"state":"dead","appId":null,"appInfo":{"driverLogUrl":null,"sparkUiUrl":null},"log":["java.lang.Exception: No YARN application is found with tag livy-batch-2-xbjzpkbp in 60 seconds. Please check your cluster status, it is may be very busy.","com.cloudera.livy.utils.SparkYarnApp.com$cloudera$livy$utils$SparkYarnApp$$getAppIdFromTag(SparkYarnApp.scala:182) com.cloudera.livy.utils.SparkYarnApp$$anonfun$1$$anonfun$4.apply(SparkYarnApp.scala:248) com.cloudera.livy.utils.SparkYarnApp$$anonfun$1$$anonfun$4.apply(SparkYarnApp.scala:245) scala.Option.getOrElse(Option.scala:120) com.cloudera.livy.utils.SparkYarnApp$$anonfun$1.apply$mcV$sp(SparkYarnApp.scala:245) com.cloudera.livy.Utils$$anon$1.run(Utils.scala:95)"]},{"id":3,"state":"dead","appId":null,"appInfo":{"driverLogUrl":null,"sparkUiUrl":null},"log":["java.lang.Exception: No YARN application is found with tag livy-batch-3-sbilpm4a in 60 seconds. Please check your cluster status, it is may be very busy.","com.cloudera.livy.utils.SparkYarnApp.com$cloudera$livy$utils$SparkYarnApp$$getAppIdFromTag(SparkYarnApp.scala:182) com.cloudera.livy.utils.SparkYarnApp$$anonfun$1$$anonfun$4.apply(SparkYarnApp.scala:248) com.cloudera.livy.utils.SparkYarnApp$$anonfun$1$$anonfun$4.apply(SparkYarnApp.scala:245) scala.Option.getOrElse(Option.scala:120) com.cloudera.livy.utils.SparkYarnApp$$anonfun$1.apply$mcV$sp(SparkYarnApp.scala:245) com.cloudera.livy.Utils$$anon$1.run(Utils.scala:95)"]},{"id":4,"state":"dead","appId":null,"appInfo":{"driverLogUrl":null,"sparkUiUrl":null},"log":["java.lang.Exception: No YARN application is found with tag livy-batch-4-rvlzpl8o in 60 seconds. 
Please check your cluster status, it is may be very busy.","com.cloudera.livy.utils.SparkYarnApp.com$cloudera$livy$utils$SparkYarnApp$$getAppIdFromTag(SparkYarnApp.scala:182) com.cloudera.livy.utils.SparkYarnApp$$anonfun$1$$anonfun$4.apply(SparkYarnApp.scala:248) com.cloudera.livy.utils.SparkYarnApp$$anonfun$1$$anonfun$4.apply(SparkYarnApp.scala:245) scala.Option.getOrElse(Option.scala:120) com.cloudera.livy.utils.SparkYarnApp$$anonfun$1.apply$mcV$sp(SparkYarnApp.scala:245) com.cloudera.livy.Utils$$anon$1.run(Utils.scala:95)"]}]}
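For reference, the same endpoint can also be queried from the command line. A minimal sketch with curl, assuming the self-signed gateway certificate (hence -k) and the clsadmin user shown above; curl will prompt for the password, and the batch id 0 in the second call is just one of the dead sessions from the JSON:
# List all Livy batch sessions through the Knox gateway
curl -k -u clsadmin https://chs-uvi-769-mn001.bi.services.eu-gb.bluemix.net:8443/gateway/default/livy/v1/batches
# Fetch the log of a single batch, e.g. batch id 0
curl -k -u clsadmin https://chs-uvi-769-mn001.bi.services.eu-gb.bluemix.net:8443/gateway/default/livy/v1/batches/0/log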
I am not able to figure out what the issue is. I enabled the Bluemix trace; the output in debug mode for the same command is:
C:\Users\IBM_ADMIN\eclipse-workspace\mySparkApp\target>SET BLUEMIX_TRACE=true
C:\Users\IBM_ADMIN\eclipse-workspace\mySparkApp\target>bx ae spark-submit --className mySparkApp.Main mySparkApp-0.0.1-SNAPSHOT.jar
Current user is 'clsadmin'
Password>
DEBUG: INPUTS PROVIDED
file mySparkApp-0.0.1-SNAPSHOT.jar
proxyUser NOTSET
className mySparkApp.Main
inputArgs []
jars []
pyFiles []
files []
driverMemory NOTSET
driverCores -1
executorMemory NOTSET
executorCores -1
numExecutors -1
archives []
queue NOTSET
name NOTSET
conf map[]
asynchronous false
upload false
DEBUG: PROCESSED INPUTS
file mySparkApp-0.0.1-SNAPSHOT.jar
proxyUser NOTSET
className mySparkApp.Main
inputArgs []
jars []
pyFiles []
files []
driverMemory NOTSET
driverCores -1
executorMemory NOTSET
executorCores -1
numExecutors -1
archives []
queue NOTSET
name NOTSET
conf map[]
asynchronous false
upload false
Contacting endpoint 'https://159.122.220.119:8443'...
REQUEST: [2018-05-03T16:52:33+05:30]
POST /gateway/default/livy/v1/batches HTTP/1.1
Host: 159.122.220.119:8443
Accept: application/json
Authorization: [PRIVATE DATA HIDDEN]
Content-Type: application/json
X-Requested-By: livy
{"file":"mySparkApp-0.0.1-SNAPSHOT.jar","className":"mySparkApp.Main"}
RESPONSE: [2018-05-03T16:52:34+05:30] Elapsed: 1382ms
HTTP/1.1 500 Server Error
Connection: close
Content-Length: 321
Cache-Control: must-revalidate,no-cache,no-store
Content-Type: text/html; charset=ISO-8859-1
Date: Thu, 03 May 2018 11:22:52 GMT
Date: Thu, 03 May 2018 11:22:52 GMT
Server: Jetty(9.2.16.v20160414)
Set-Cookie: JSESSIONID=vyw2g1lyyjmo2d2spzs5pnd1;Path=/gateway/default;Secure;HttpOnly
Set-Cookie: rememberMe=deleteMe; Path=/gateway/default; Max-Age=0; Expires=Wed, 02-May-2018 11:22:52 GMT
Strict-Transport-Security: max-age=31536000
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
<title>Error 500 Server Error</title>
</head>
<body><h2>HTTP ERROR 500</h2>
<p>Problem accessing /gateway/default/livy/v1/batches. Reason:
<pre> Server Error</pre></p><hr><i><small>Powered by Jetty://</small></i><hr/>
</body>
</html>
FAILED
Server call failed. Message: '<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
<title>Error 500 Server Error</title>
</head>
<body><h2>HTTP ERROR 500</h2>
<p>Problem accessing /gateway/default/livy/v1/batches. Reason:
<pre> Server Error</pre></p><hr><i><small>Powered by Jetty://</small></i><hr/>
</body>
</html>
'
Can anyone see what the issue is? Any help or pointers?
Upvotes: 0
Views: 929
Reputation: 15
Usually I have noticed this error when there is an issue with the program itself. If you have access to the cluster (i.e. ssh clsadmin@ your cluster host), you can try running the Java program with spark-submit directly and confirm that the program is valid and there is no issue with it.
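For example, a rough sketch (the host below is a placeholder for your cluster's management node, and it assumes the jar has already been copied there; note that plain spark-submit takes --class, not the --className flag used by bx ae):
ssh clsadmin@<your-cluster-host>
spark-submit --class mySparkApp.Main mySparkApp-0.0.1-SNAPSHOT.jar
If the job also fails when run this way, the problem is in the application itself rather than in the Livy/Knox layer.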
Upvotes: 1
Reputation: 21
1) Could you please check the Livy log on the Spark node (/var/log/livy2) for any error messages logged there? 2) You can also try restarting the Livy service on your cluster.
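For example, something along these lines (the host is a placeholder, and the exact log file name under /var/log/livy2 varies by release, so list the directory first):
ssh clsadmin@<your-cluster-host>
ls -lt /var/log/livy2/
tail -n 100 /var/log/livy2/<most-recent-log-file>
Restarting Livy is typically done from the Ambari UI, where it shows up as a component of the Spark2 service.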
On our test server, we are able to run the sample code:
bx ae spark-submit --className org.apache.spark.examples.SparkPi local:/usr/hdp/current/spark2-client/jars/spark-examples.jar
User (clsadmin)>
Password>
Contacting endpoint 'https://169.60.167.93:8443'...
Job ID '8'
Waiting for job to return application ID. Will check every 10 seconds, and stop checking after 2 minutes. Press Control C to stop waiting.
Finished contacting endpoint 'https://169.60.167.93:8443'
OK
Job ID '8'
Application ID 'application_1521738019714_0083'
Done
Upvotes: 0