Reputation: 53
I am unable to run the Spark standalone example in Scala (https://spark.incubator.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala).
The sbt package command runs successfully, but sbt run fails with an error.
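For reference, my SimpleApp.scala follows the quick-start guide and looks roughly like this (the file paths below are placeholders, not my exact values):

/* SimpleApp.scala -- adapted from the quick-start guide; paths are placeholders */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "/path/to/spark/README.md" // any text file on the local machine
    // local master; the jar built by `sbt package` is shipped to executors
    val sc = new SparkContext("local", "Simple App", "/path/to/spark",
      List("target/scala-2.10/simple-project_2.10-1.0.jar"))
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}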
Output for sbt package
[info] Set current project to Simple Project (in build file:/home/raghuveer/Spark/)
[info] Updating {file:/home/raghuveer/Spark/}default-13c61e...
[info] Resolving com.codahale.metrics#metrics-graphite;3.0.0 ...
[info] Done updating.
[info] Compiling 1 Scala source to /home/raghuveer/Spark/target/scala-2.10/classes...
[info] Packaging /home/raghuveer/Spark/target/scala-2.10/simple-project_2.10-1.0.jar ...
[info] Done packaging.
[success] Total time: 16 s, completed Feb 27, 2014 6:19:14 PM
Error for sbt run
ERROR executor.Executor: Exception in task ID 0 java.io.IOException: Server returned HTTP response code: 504 for URL:http://10.135.217.189:49650/jars/simple-project_2.10-1.0.jar
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1403)
at java.net.URL.openStream(URL.java:1031)
at org.apache.spark.util.Utils$.fetchFile(Utils.scala:253)
at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$6.apply(Executor.scala:345)
at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$6.apply(Executor.scala:343)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:343)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$run$1.apply$mcV$sp(Executor.scala:194)
at org.apache.spark.deploy.SparkHadoopUtil.runAsUser(SparkHadoopUtil.scala:49)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:178)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:701)
[error] (run-main) org.apache.spark.SparkException: Job aborted: Task 0.0:0 failed 1 times (most recent failure:
Exception failure: java.io.IOException: Server returned HTTP response code: 504 for URL: http://10.135.217.189:49650/jars/simple-project_2.10-1.0.jar)org.apache.spark.SparkException: Job aborted: Task 0.0:0 failed 1 times (most recent failure:
Exception failure: java.io.IOException: Server returned HTTP response code: 504 for URL: http://10.135.217.189:49650/jars/simple-project_2.10-1.0.jar)
And the trace is
[trace] Stack trace suppressed: run last compile:run for the full output.
14/02/27 18:20:58 INFO network.ConnectionManager: Selector thread was interrupted!
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) Nonzero exit code: 1
[error] Total time: 36 s, completed Feb 27, 2014 6:20:58 PM
EDIT: After I disconnected the network connection, the java.io.IOException: Server returned HTTP response code: 504 no longer appears; the job runs successfully and prints the output. But I can't figure out why it behaves that way.
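My guess (an assumption, not confirmed) is that an HTTP proxy is configured for the JVM, so the executor's fetch of the application jar from the driver's local HTTP server is routed through the proxy, which cannot reach the 10.x address and returns 504. If that is the cause, exempting local addresses from the proxy should have the same effect as unplugging the network; a minimal sketch, using the IP from the error message:

import org.apache.spark.SparkContext

object SimpleApp {
  def main(args: Array[String]) {
    // Assumption: an HTTP proxy is set via http.proxyHost/http.proxyPort.
    // Exempt local addresses so the executor can fetch the application jar
    // directly from the driver's embedded HTTP server (10.135.217.189 is
    // the driver's own IP from the error message; adjust for your machine).
    System.setProperty("http.nonProxyHosts", "localhost|127.0.0.1|10.135.217.189")

    val sc = new SparkContext("local", "Simple App", "/path/to/spark",
      List("target/scala-2.10/simple-project_2.10-1.0.jar"))
    // ... rest of the quick-start app unchanged ...
    sc.stop()
  }
}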
Upvotes: 0
Views: 1444
Reputation: 6139
This post shows how to create a Spark Streaming standalone application and how to run Spark applications in the Scala IDE (Eclipse).
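For illustration (my own sketch, not code from the linked post), a minimal Spark Streaming standalone app looks something like this:

import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.StreamingContext._

object StreamingWordCount {
  def main(args: Array[String]) {
    // Read text from a local TCP socket (e.g. started with `nc -lk 9999`)
    // in 5-second batches and count the words in each batch.
    val ssc = new StreamingContext("local[2]", "StreamingWordCount", Seconds(5))
    val lines = ssc.socketTextStream("localhost", 9999)
    val counts = lines.flatMap(_.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)
    counts.print()
    ssc.start()
    ssc.awaitTermination()
  }
}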
Upvotes: 1