Ali

Reputation: 471

How to install Spark as a daemon?

I start Spark as a master and a slave on two machines, following this guide:
https://www.tutorialkart.com/apache-spark/how-to-setup-an-apache-spark-cluster/
Then I create a systemd unit for each of them, but when I start them as services they fail to stay running. Here is my systemctl status:

● sparkslave.service - Spark Slave
   Loaded: loaded (/etc/systemd/system/sparkslave.service; enabled; vendor preset: enabled)
   Active: inactive (dead) since Mon 2019-12-09 07:30:22 EST; 55s ago
  Process: 31680 ExecStart=/usr/lib/spark/sbin/start-slave.sh spark://172.16.3.90:7077 (code=exited, status=0/SUCCESS)
 Main PID: 31680 (code=exited, status=0/SUCCESS)

Dec 09 07:30:19 SparkSlave1 systemd[1]: Started Spark Slave.
Dec 09 07:30:19 SparkSlave1 start-slave.sh[31680]: starting org.apache.spark.deploy.worker.Worker, logging to /usr/lib/spark/logs/spark-spark-user-org.apache.spark.deploy.worker.Worker-1-SparkSlave1.out

And this is my sparkslave.service:

[Unit]
Description=Spark Slave
After=network.target

[Service]
User=spark-user
WorkingDirectory=/usr/lib/spark/sbin
ExecStart=/usr/lib/spark/sbin/start-slave.sh spark://172.16.3.90:7077
Restart=on-failure
RestartSec=10s

[Install]
WantedBy=multi-user.target

What is the problem?

Upvotes: 1

Views: 2145

Answers (1)

Ali

Reputation: 471

The service type must be changed from simple (the default) to forking. start-slave.sh launches the Worker as a background process and then exits, so with Type=simple systemd treats the script itself as the main process and marks the service inactive (dead) as soon as it returns, which is exactly what the status output shows (code=exited, status=0/SUCCESS). With Type=forking, systemd expects ExecStart to fork and exit, and keeps tracking the forked daemon instead:

[Service]
Type=forking
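
For reference, here is a minimal sketch of the full unit with that one change applied (the paths, user, and master URL are taken from the question and may differ on your machines):

[Unit]
Description=Spark Slave
After=network.target

[Service]
Type=forking
User=spark-user
WorkingDirectory=/usr/lib/spark/sbin
ExecStart=/usr/lib/spark/sbin/start-slave.sh spark://172.16.3.90:7077
Restart=on-failure
RestartSec=10s

[Install]
WantedBy=multi-user.target

After editing the unit, reload systemd and restart the service:

sudo systemctl daemon-reload
sudo systemctl restart sparkslave.service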

Upvotes: 2
