Naman Agarwal

Reputation: 654

How to run Spark on Mesos in standalone mode

I have installed Mesos on my local machine and configured it as described in the Mesos setup guide. Now I want to run Spark on this Mesos installation. I have configured Spark according to the official documentation, and I have a single-node Hadoop cluster running on my local machine. The Spark binary package is copied to the HDFS root directory, and I have set the following properties in spark-env.sh:

export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so
export SPARK_EXECUTOR_URI=hdfs://spark-2.2.0-bin-hadoop2.7.tgz

and in spark-defaults.conf:

spark.executor.uri         hdfs://spark-2.2.0-bin-hadoop2.7.tgz

and running Spark with:

./bin/spark-shell --master mesos://host:5050

is giving the following error:

ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not parse Master URL: 'mesos://host:5050'

Please guide me on what I am doing wrong and how to correct it.

Upvotes: 0

Views: 613

Answers (1)

Naman Agarwal

Reputation: 654

I have successfully installed Apache Spark on Mesos; please follow the steps below on your Ubuntu machine. Two notes on your setup: the "Could not parse Master URL" error usually means the Spark build does not include Mesos support (the build step below takes care of that), and an executor URI of the form hdfs://spark-2.2.0-bin-hadoop2.7.tgz is malformed because it lacks the namenode host and path, so the configuration below uses a plain local path instead.

# Update the packages.
$ sudo apt-get update

# Install a few utility tools.
$ sudo apt-get install -y tar wget git

# Install the latest OpenJDK.
$ sudo apt-get install -y openjdk-8-jdk
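# (Optional) Verify the JDK is on the PATH.
$ java -version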

# Install autotools (Only necessary if building from git repository).
$ sudo apt-get install -y autoconf libtool

# Install other Mesos dependencies.
$ sudo apt-get -y install build-essential python-dev python-six python-virtualenv libcurl4-nss-dev libsasl2-dev libsasl2-modules maven libapr1-dev libsvn-dev
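# The next step assumes a Mesos source tree named "mesos"; fetch one first if needed.
# Either download and extract a release tarball (1.4.0 here is just an example version)...
$ wget http://archive.apache.org/dist/mesos/1.4.0/mesos-1.4.0.tar.gz
$ tar -zxf mesos-1.4.0.tar.gz
$ mv mesos-1.4.0 mesos

# ...or clone the git repository instead.
$ git clone https://gitbox.apache.org/repos/asf/mesos.git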

# Change into the Mesos source directory.
$ cd mesos

# If you get the error "libz is required to build mesos":
$ sudo apt-get install -y zlib1g-dev

# Configure and build.
$ mkdir build
$ cd build
$ ../configure
$ make
$ make check
$ make install
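# If Java later fails to load libmesos.so, refreshing the linker cache may help
# (make install places the library under /usr/local/lib by default).
$ sudo ldconfig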

# From the build directory, start the Mesos master.
$ ./bin/mesos-master.sh --ip=127.0.0.1 --work_dir=/tmp/mesos

# Start a Mesos slave (agent).
$ ./bin/mesos-slave.sh --master=127.0.0.1:5050 --work_dir=/tmp/mesos

# If you hit a systemd-related permission issue, start the slave with systemd support disabled.
$ ./bin/mesos-slave.sh --master=127.0.0.1:5050 --work_dir=/tmp/mesos --no-systemd_enable_support
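# (Optional) Sanity-check that the master is up and the agent has registered;
# the JSON response should list one entry under "slaves".
$ curl http://127.0.0.1:5050/master/state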

# Configuring Spark with Mesos

# Build Spark with Mesos support (run from the Spark source directory).
$ ./build/mvn -Pmesos -DskipTests clean package
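If you also need a Spark distribution tarball for the executor URI below, the same source tree can produce one with Spark's make-distribution.sh; the --name value is arbitrary and ends up in the file name (e.g. spark-2.2.0-bin-custom.tgz):

$ ./dev/make-distribution.sh --name custom --tgz -Pmesos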

In spark-env.sh:
export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so
export SPARK_EXECUTOR_URI=/localpath/to/spark-2.2.0-bin-hadoop2.7.tgz

In spark-defaults.conf:
spark.executor.uri         /localpath/to/spark-2.2.0-bin-hadoop2.7.tgz
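Alternatively, you can skip editing spark-defaults.conf and pass the same setting on the command line:

$ ./bin/spark-shell --master mesos://127.0.0.1:5050 --conf spark.executor.uri=/localpath/to/spark-2.2.0-bin-hadoop2.7.tgz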

# Run the Spark shell.
$ ./bin/spark-shell --master mesos://127.0.0.1:5050
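Once the shell starts, a trivial job is a quick way to confirm that executors actually launch on Mesos; for a non-interactive check you can pipe a one-liner into the shell (it should print 100):

$ echo 'println(sc.parallelize(1 to 100).count())' | ./bin/spark-shell --master mesos://127.0.0.1:5050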

# The Mesos UI is available at:
http://127.0.0.1:5050

Upvotes: 0
