Oleg Shirokikh

Reputation: 3565

Creating Apache Spark-powered "As Service" applications

The question is about ways to create a Windows desktop-based and/or web-based client application that can connect and talk at run-time to a server running a Spark application (either a local or an on-premise cloud distribution).

Any language/architecture may work. So far I've seen two things that might help, but I'm not sure whether they are the best alternatives or how they work yet:

  1. Spark Job Server - https://github.com/spark-jobserver/spark-jobserver - defines a REST API for Spark
  2. Hue - http://gethue.com/get-started-with-spark-deploy-spark-server-and-compute-pi-from-your-web-browser/ - uses item 1)
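As a rough illustration of what talking to Spark Job Server's REST API looks like from a client, here is a minimal sketch in Python. The endpoint paths (`/jobs` with `appName`, `classPath`, and `sync` query parameters) follow the spark-jobserver README; the host, port, and job class names are assumptions for illustration, not a tested integration.

```python
# Hypothetical thin client for the Spark Job Server REST API.
# It only builds the request URLs; in a real client you would POST
# to these URLs (e.g. with urllib.request or requests) and parse
# the JSON job result that comes back.
class JobServerClient:
    def __init__(self, base_url="http://localhost:8090"):
        # 8090 is spark-jobserver's default port (assumption: defaults unchanged)
        self.base_url = base_url.rstrip("/")

    def _url(self, path, **params):
        # Assemble path + query string; dict preserves insertion order
        qs = "&".join(f"{k}={v}" for k, v in params.items())
        return f"{self.base_url}{path}" + (f"?{qs}" if qs else "")

    def submit_job_url(self, app_name, class_path, sync=True):
        # POST the job's input as the request body to this URL;
        # sync=true blocks until the result is ready
        return self._url("/jobs", appName=app_name,
                         classPath=class_path, sync=str(sync).lower())
```

For example, `JobServerClient().submit_job_url("test", "spark.jobserver.WordCountExample")` yields the URL a client would POST to in order to run the word-count example job from the spark-jobserver repository.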

Any advice would be appreciated. A simple toy example program (or steps) showing, e.g., how to build such a client that creates a Spark context on a local machine, reads a text file, and returns basic stats would be the ideal answer!
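To make the "basic stats" part concrete, here is a pure-Python stand-in for what the server-side job would compute. In a real Spark job these would be RDD operations (`sc.textFile(path)`, `flatMap`, `count`), but the logic the client gets back as a result is just this; the function name and the choice of stats are illustrative assumptions.

```python
# Stand-in for the result a Spark "basic stats" job would return.
# On a cluster, the same counts would come from RDD transformations
# over sc.textFile(path) rather than in-memory string methods.
def basic_stats(text):
    return {
        "lines": len(text.splitlines()),  # number of lines in the file
        "words": len(text.split()),       # whitespace-separated tokens
        "chars": len(text),               # total characters
    }
```

A client would submit the file path as the job's input and receive a small JSON-serializable dict like this as the job result.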

Upvotes: 2

Views: 1026

Answers (1)

tgpfeiffer

Reputation: 1838

You may want to have a look at how the folks at Adobe Research built their Spindle platform. Personally, I haven't investigated it in detail, but they also provide "Spark query results as a service".

Upvotes: 2
