Reputation: 5238
I've installed Jupyter Notebook on Python 3.5.2 on Ubuntu Server 16.04, and I've also installed Apache Toree to run Spark jobs from Jupyter.
I ran:
pip3 install toree
jupyter toree install --spark_home=/home/arik/spark-2.0.1-bin-hadoop2.7/ # My Spark directory
The output indicated success:
[ToreeInstall] Installing Apache Toree version 0.1.0.dev8
[ToreeInstall] Apache Toree is an effort undergoing incubation at the Apache Software Foundation (ASF), sponsored by the Apache Incubator PMC. Incubation is required of all newly accepted projects until a further review indicates that the infrastructure, communications, and decision making process have stabilized in a manner consistent with other successful ASF projects. While incubation status is not necessarily a reflection of the completeness or stability of the code, it does indicate that the project has yet to be fully endorsed by the ASF. Additionally, this release is not fully compliant with Apache release policy and includes a runtime dependency that is licensed as LGPL v3 (plus a static linking exception). This package is currently under an effort to re-license (https://github.com/zeromq/jeromq/issues/327).
[ToreeInstall] Creating kernel Scala
[ToreeInstall] Removing existing kernelspec in /usr/local/share/jupyter/kernels/apache_toree_scala
[ToreeInstall] Installed kernelspec apache_toree_scala in /usr/local/share/jupyter/kernels/apache_toree_scala
I thought everything was successful, but every time I create an Apache Toree notebook, the kernel status shows "Kernel busy" and all of my commands are ignored.
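For reference, here is how I tried to get more detail (a sketch; the kernelspec path comes from the install output above):

jupyter kernelspec list   # confirm apache_toree_scala is registered
cat /usr/local/share/jupyter/kernels/apache_toree_scala/kernel.json   # check SPARK_HOME and the launch command
jupyter notebook --debug   # kernel start-up errors are echoed to this terminal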
I couldn't find anything about this issue online. Alternatives to Toree would also be accepted.
Thank you
Upvotes: 2
Views: 1360
Reputation: 141
Unfortunately, the stable Toree release (0.1.0) does not work with Scala 2.11, which is what the Spark 2.x binaries are built against. Either downgrade to a Spark build that uses Scala 2.10, or use a more recent version of Toree (still in beta).
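A quick way to confirm the mismatch (a sketch; spark-submit prints the Scala version it was built with in its startup banner):

$SPARK_HOME/bin/spark-submit --version   # look for "Using Scala version 2.11.x"

The way I made it work with Spark 2.1 and Scala 2.11: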
#!/bin/bash
# Install the beta Toree release from the hyoon channel on anaconda.org
pip install -i https://pypi.anaconda.org/hyoon/simple toree
# Install the Scala + Spark kernel for the current user
jupyter toree install --spark_home=$SPARK_HOME --user
# Also install a PySpark kernel
jupyter toree install --spark_home=$SPARK_HOME --interpreters=PySpark --user
# Verify the kernels are registered, then launch the notebook server
jupyter kernelspec list
jupyter notebook
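If the install worked, jupyter kernelspec list should print entries along these lines (a sketch; exact names and paths vary with the Toree version and the --user flag):

apache_toree_scala      ~/.local/share/jupyter/kernels/apache_toree_scala
apache_toree_pyspark    ~/.local/share/jupyter/kernels/apache_toree_pyspark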
Look at this post and this post for more info.
It will eventually look like this:
Upvotes: 3