Jens Englert

Reputation: 1

connecting to spark standalone cluster does not work within RStudio

I created a virtual machine running Ubuntu Server 16.04 and have already installed Spark along with all of its dependencies and prerequisites. The Spark cluster runs on the VM, and the master and all workers can be started with start-all.sh.
Now I am trying to submit SparkR jobs to this cluster from RStudio on my local computer. I specified the Spark context with master="spark://192.168.0.105:7077" to connect to the cluster, which is clearly running, since the master web UI at IP:8080 is reachable. Is there any configuration required so that the master accepts connections from a device that is not yet part of the cluster?
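For reference, the connection attempt from RStudio looks roughly like the sketch below (the SPARK_HOME path and app name are illustrative; the master URL is the one from the question). Note that the SparkR package version on the client generally has to match the Spark version running on the cluster:

```r
# Sketch of the connection attempt from the local RStudio session.
# SPARK_HOME path and appName are assumptions, not from the question.
Sys.setenv(SPARK_HOME = "/usr/local/spark")
library(SparkR, lib.loc = file.path(Sys.getenv("SPARK_HOME"), "R", "lib"))

# Connect to the standalone master running on the VM:
sc <- sparkR.session(master = "spark://192.168.0.105:7077",
                     appName = "rstudio-remote")
```

Besides the master port (7077), the workers must also be able to reach the driver (the RStudio machine) back over the network, which often fails for a machine outside the cluster.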

The error in R is:

Error in handleErrors(returnStatus, conn) : java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem

Upvotes: 0

Views: 602

Answers (1)

Akshay Kadidal

Reputation: 535

You could try using the Livy REST API interface instead: https://livy.incubator.apache.org/

See sparklyr - Connect remote hadoop cluster
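With sparklyr this looks roughly like the sketch below, assuming a Livy server has been started on the VM on its default port 8998 (the IP is the one from the question; sparklyr supports Livy connections via method = "livy"):

```r
library(sparklyr)

# Connect to the cluster through the Livy server rather than
# directly to the standalone master:
sc <- spark_connect(master = "http://192.168.0.105:8998",
                    method = "livy")

# Use the connection as usual, e.g. copy a local data frame to the cluster:
iris_tbl <- copy_to(sc, iris)

spark_disconnect(sc)
```

Because Livy brokers the job submission over HTTP, the client machine only needs to reach the Livy port, which sidesteps the driver/worker connectivity requirements of a direct standalone connection.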

Upvotes: 0

Related Questions