M. Wadi

Reputation: 111

Jupyter Notebook error while using PySpark Kernel: the code failed because of a fatal error: Error sending http request

I am using Jupyter Notebook's PySpark kernel. I have successfully selected the PySpark kernel, but I keep getting the error below:

The code failed because of a fatal error: Error sending http request and maximum retry encountered.. Some things to try:

a) Make sure Spark has enough available resources for Jupyter to create a Spark context.

b) Contact your Jupyter administrator to make sure the Spark magics library is configured correctly.

c) Restart the kernel.

Here is the log as well:

2019-10-10 13:37:43,741 DEBUG   SparkMagics Initialized spark magics.
2019-10-10 13:37:43,742 INFO    EventsHandler   InstanceId: 32a21583-6879-4ad5-88bf-e07af0b09387,EventName: notebookLoaded,Timestamp: 2019-10-10 10:37:43.742475
2019-10-10 13:37:43,744 DEBUG   python_jupyter_kernel   Loaded magics.
2019-10-10 13:37:43,744 DEBUG   python_jupyter_kernel   Changed language.
2019-10-10 13:37:44,356 DEBUG   python_jupyter_kernel   Registered auto viz.
2019-10-10 13:37:45,440 INFO    EventsHandler   InstanceId: 32a21583-6879-4ad5-88bf-e07af0b09387,EventName: notebookSessionCreationStart,Timestamp: 2019-10-10 10:37:45.440323,SessionGuid: d230b1f3-6bb1-4a66-bde1-7a73a14d7939,LivyKind: pyspark
2019-10-10 13:37:49,591 ERROR   ReliableHttpClient  Request to 'http://localhost:8998/sessions' failed with 'HTTPConnectionPool(host='localhost', port=8998): Max retries exceeded with url: /sessions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x0000013184159808>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it'))'
2019-10-10 13:37:49,591 INFO    EventsHandler   InstanceId: 32a21583-6879-4ad5-88bf-e07af0b09387,EventName: notebookSessionCreationEnd,Timestamp: 2019-10-10 10:37:49.591650,SessionGuid: d230b1f3-6bb1-4a66-bde1-7a73a14d7939,LivyKind: pyspark,SessionId: -1,Status: not_started,Success: False,ExceptionType: HttpClientException,ExceptionMessage: Error sending http request and maximum retry encountered.
2019-10-10 13:37:49,591 ERROR   SparkMagics Error creating session: Error sending http request and maximum retry encountered.

Note that I am trying to configure this on Windows. Thanks a lot.

Upvotes: 10

Views: 28918

Answers (3)

AB_87

Reputation: 1156

Posting this answer as it may help someone facing this issue when using a SageMaker notebook with a Glue Dev Endpoint.

I received the same error message in my PySpark kernel notebook. In my case the issue was a missing lifecycle configuration attached to the notebook instance, which had somehow been removed. I delete and recreate the dev endpoint every day, but the lifecycle config normally remains attached to the notebook.
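For anyone rebuilding such a setup, a minimal sketch of an "on start" lifecycle-configuration script is below. It points sparkmagic's Livy URL at the dev endpoint instead of localhost. The IP address and the default config contents here are placeholder assumptions, not values from this answer:

```shell
#!/bin/sh
# Sketch of an "on start" lifecycle-configuration script for a SageMaker
# notebook instance backed by a Glue Dev Endpoint. GLUE_ENDPOINT_IP is an
# assumed placeholder for the endpoint's private IP.
GLUE_ENDPOINT_IP="10.0.0.5"
CONFIG="$HOME/.sparkmagic/config.json"

mkdir -p "$HOME/.sparkmagic"
# Start from a minimal default config if none exists yet.
[ -f "$CONFIG" ] || echo '{"kernel_python_credentials": {"url": "http://localhost:8998"}}' > "$CONFIG"

# Point sparkmagic's Livy URL at the dev endpoint instead of localhost.
sed -i "s|http://localhost:8998|http://${GLUE_ENDPOINT_IP}:8998|g" "$CONFIG"
```

Without this rewrite, sparkmagic keeps trying localhost:8998 and fails with exactly the "Max retries exceeded" error shown in the question.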

Upvotes: -1

Sjoerd

Reputation: 111

I faced the same issue. You can solve it by using a Python 3 kernel (notebook) instead of a PySpark kernel (notebook). I used the following code to set up the Spark session:

import findspark
findspark.init()  # locate the local Spark installation first

import pyspark
from pyspark.sql import SparkSession

# May take a while locally
spark = SparkSession.builder.appName("test").getOrCreate()
spark

Upvotes: 8

Luis Galvez

Reputation: 461

If you are trying to connect your Jupyter Notebook to a Spark server through Livy (e.g. an AWS Glue Development Endpoint), you have to replace "localhost" with the Spark server's IP address in ~/.sparkmagic/config.json.
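A sketch of the relevant section of that file is below; the placeholder IP is an assumption you should replace with your server's address:

```json
{
  "kernel_python_credentials": {
    "username": "",
    "password": "",
    "url": "http://<spark-server-ip>:8998",
    "auth": "None"
  }
}
```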

As mentioned here: https://aws.amazon.com/blogs/machine-learning/build-amazon-sagemaker-notebooks-backed-by-spark-in-amazon-emr/

Upvotes: 0
