Himanshu Mehra

Reputation: 115

How to set spark-job-server config?

I am running spark-job-server 0.5.3 from Ooyala. I have followed the official documentation, and it works fine when started by sbt using the reStart command. But I can't:

  1. make it work using the server_start.sh script, and

  2. run it on a standalone cluster. It currently runs on the default local[*] master, and there is no clear documentation on how to run the job server against a standalone cluster.

Any solution, or a link to a blog post or proper docs, is appreciated.

Thanks in advance.

Upvotes: 1

Views: 2025

Answers (1)

Gillespie

Reputation: 2228

Documentation for the main Spark Job Server project is here: github.com/spark-jobserver

  • Copy config/local.sh.template to <environment>.sh and edit as
    appropriate. NOTE: be sure to set SPARK_VERSION if you need to
    compile against a different version, e.g. 1.4.1 for job server 0.5.2.
  • Copy config/shiro.ini.template to shiro.ini and edit as appropriate. NOTE: only required when authentication = on.
  • Copy config/local.conf.template to <environment>.conf and edit as appropriate.
  • Run bin/server_deploy.sh — this packages the job server along with the config files and pushes it to the remotes you have configured in <environment>.sh.
  • On the remote server, start it in the deployed directory with server_start.sh and stop it with server_stop.sh.
  • The server_start.sh script uses spark-submit under the hood and may be passed any of the standard extra arguments for spark-submit.
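Since the question also asks about standalone mode: the master is set in the .conf file you create in the third step above. A minimal sketch, assuming the key layout of local.conf.template (the hostname, port, and values shown are placeholders, not settings from the question):

```
# mycluster.conf — illustrative fragment only
spark {
  # Point at the standalone master instead of the default local[*]
  master = "spark://master-host:7077"

  jobserver {
    port = 8090
  }
}
```

Deploy this file alongside the matching mycluster.sh environment file, and the job server's contexts will be submitted to the standalone cluster rather than run in local mode.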

NOTE: by default the assembly jar from job-server-extras, which includes support for SQLContext and HiveContext, is used. If you face issues with all the extra dependencies, consider modifying the install scripts to invoke sbt job-server/assembly instead, which doesn't include the extra dependencies.
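The change described in the note above amounts to swapping one sbt target for another inside the deploy script. A hypothetical fragment (the exact line in your copy of the script may differ):

```
# In server_deploy.sh — build the leaner assembly without the
# job-server-extras dependencies (no SQLContext/HiveContext support):
sbt job-server/assembly
# instead of the default:
# sbt job-server-extras/assembly
```

Only do this if you don't need SQL or Hive contexts, since those classes will no longer be on the server's classpath.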

Upvotes: 1
