WestCoastProjects

Reputation: 63281

PySpark / pyspark kernels not working in Jupyter notebook

Here are installed kernels:

 $ jupyter-kernelspec list


Available kernels:
  apache_toree_scala    /usr/local/share/jupyter/kernels/apache_toree_scala
  apache_toree_sql      /usr/local/share/jupyter/kernels/apache_toree_sql
  pyspark3kernel        /usr/local/share/jupyter/kernels/pyspark3kernel
  pysparkkernel         /usr/local/share/jupyter/kernels/pysparkkernel
  python3               /usr/local/share/jupyter/kernels/python3
  sparkkernel           /usr/local/share/jupyter/kernels/sparkkernel
  sparkrkernel          /usr/local/share/jupyter/kernels/sparkrkernel

A new notebook was created but fails with

The code failed because of a fatal error:
    Error sending http request and maximum retry encountered..


There is no error message in the Jupyter console.

Upvotes: 3

Views: 5675

Answers (1)

base64k

Reputation: 101

If you use sparkmagic to connect your Jupyter notebook to Spark, you also need to start Livy, the REST API service that sparkmagic uses to talk to your Spark cluster.
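
sparkmagic reads the Livy endpoint from ~/.sparkmagic/config.json. A minimal sketch, assuming Livy runs locally on its default port 8998 (the field names follow sparkmagic's example config; adjust the URL and auth settings to your setup):

    {
      "kernel_python_credentials": {
        "username": "",
        "password": "",
        "url": "http://localhost:8998",
        "auth": "None"
      }
    }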

  1. Download Livy from the Apache Livy website and unzip it
  2. Check that the SPARK_HOME environment variable is set; if not, set it to your Spark installation directory
  3. Start the Livy server with <livy_home>/bin/livy-server from the shell/command line (a sketch of these steps follows the list)
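
Something like the following, assuming the Livy 0.7.1-incubating binary release and a Spark installation under /opt/spark (both the version and the paths are assumptions; substitute your own):

    # download and unpack a Livy binary release (version/URL are assumptions)
    wget https://archive.apache.org/dist/incubator/livy/0.7.1-incubating/apache-livy-0.7.1-incubating-bin.zip
    unzip apache-livy-0.7.1-incubating-bin.zip

    # point Livy at your Spark installation (path is an assumption)
    export SPARK_HOME=/opt/spark

    # start the Livy server; by default it listens on http://localhost:8998
    apache-livy-0.7.1-incubating-bin/bin/livy-server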

Now go back to your notebook; you should be able to run Spark code in a cell.
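
As a quick smoke test, a sketch of a cell to run with the pysparkkernel (the `spark` session is provided by the kernel, so no imports are needed):

    # runs remotely through Livy; `spark` is injected by the sparkmagic kernel
    df = spark.range(100)
    print(df.count())  # prints 100 if the Livy connection is working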

Upvotes: 1
