D. Müller

Reputation: 3426

Use Apache Zeppelin with existing Spark Cluster

I want to install Zeppelin so that it uses my existing Spark cluster. This is what I did:

I downloaded Zeppelin v0.5.5 and built it with:

mvn clean package -Pspark-1.5 -Dspark.version=1.5.0 -Dhadoop.version=2.4.0 -Phadoop-2.4 -DskipTests
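To point Zeppelin at the cluster, I set the master URL in conf/zeppelin-env.sh, roughly like the sketch below (the SPARK_HOME path is a placeholder for my setup; alternatively, the master property can be set in Zeppelin's Spark interpreter settings):

# conf/zeppelin-env.sh (sketch; adjust hostname and path to your setup)
export MASTER=spark://my_server:7077    # master URL of the existing Spark cluster
export SPARK_HOME=/opt/spark-1.5.0      # optional: use the cluster's Spark installation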

I noticed that the local[*] master setting also works without my Spark cluster (the notebook still runs after shutting the Spark cluster down).

My problem: When I want to use my Spark cluster for a streaming application, it does not seem to work correctly. My SQL table stays empty when I use spark://my_server:7077 as master; in local mode everything works fine!

See also my other question which describes the problem: Apache Zeppelin & Spark Streaming: Twitter Example only works local
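For context, the notebook follows Zeppelin's Twitter streaming tutorial, roughly like this condensed sketch (not my exact notebook; the Tweet case class and the window size are illustrative):

import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.twitter.TwitterUtils
import sqlContext.implicits._

val ssc = new StreamingContext(sc, Seconds(2))

// receive tweets and keep a sliding 60-second window of them
val tweets = TwitterUtils.createStream(ssc, None).window(Seconds(60))

case class Tweet(createdAt: Long, text: String)

// register each window as a temp table so %sql paragraphs can query it
tweets.map(s => Tweet(s.getCreatedAt.getTime / 1000, s.getText))
  .foreachRDD(rdd => rdd.toDF().registerTempTable("tweets"))

ssc.start()

A %sql paragraph like select * from tweets then returns rows in local mode, but stays empty with the cluster master.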

Did I do something wrong?

Upvotes: 1

Views: 1731

Answers (1)

D. Müller

Reputation: 3426

The problem was caused by a missing library dependency! So before searching around for too long, first check the dependencies and make sure none is missing!

%dep
z.reset // clear any previously loaded dependencies
z.load("org.apache.spark:spark-streaming-twitter_2.10:1.5.1") // Twitter receiver for Scala 2.10 / Spark 1.5.x

Upvotes: 2
