Shashwat Tiwary

Reputation: 71

Why am I getting an error while importing SparkContext in a SageMaker notebook?

I am using SageMaker Notebook in AWS Glue for ETL development.

On importing the SparkContext library I get the error below. I have tried restarting the kernel, but it did not help. Can someone explain point "a" to me?

The code failed because of a fatal error: Error sending http request and maximum retry encountered.

Some things to try:

a. Make sure Spark has enough available resources for Jupyter to create a Spark context.

b. Contact your Jupyter administrator to make sure the Spark magics library is configured correctly.

c. Restart the kernel.

Following points to be noted:

  1. I am creating the SageMaker notebook from AWS Console > AWS Glue > Dev Endpoint > Notebooks.

  2. The VPC, subnet, and security group of the dev endpoint are the same as those of the RDS instance the connection is supposed to be made to. While creating the dev endpoint, on the networking page I chose an existing connection from the drop-down list of available connections, so the VPC, subnet, and security group were filled in automatically.

  3. I increased the DPU from 5 to 10 but still get this error.
  4. I am not able to reach the step where I can create a connection to RDS, because I get the error while importing the library.
  5. If I skip the networking info while creating the dev endpoint, I am able to import all the relevant libraries successfully (screenshot attached). (This is not suggested when connecting to RDS, as the connection would not work.)

So, this error ("The code failed because...") occurs only when a connection is provided.

It would be helpful if someone could help resolve this issue.

When adding connections, I get this error.

Without adding connections, I do not get the error.

Upvotes: 1

Views: 7079

Answers (1)

Urvashi kohli

Reputation: 46

Wondering if your configuration for the Livy endpoint is valid. Livy runs on port 8998, so you should check whether that port is open in the security group.
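As a minimal sketch of that check (assuming rule dicts in the shape boto3's `ec2.describe_security_groups()` returns under `IpPermissions`; the group ID is a placeholder), you can test whether the inbound rules cover TCP 8998:

```python
# Sketch: does a security group's inbound rule list allow TCP 8998 (Livy)?
# Rule dicts follow the shape of boto3 ec2.describe_security_groups()["SecurityGroups"][i]["IpPermissions"].

LIVY_PORT = 8998

def allows_livy(ip_permissions, port=LIVY_PORT):
    """Return True if any inbound rule covers `port` over TCP (or allows all traffic)."""
    for rule in ip_permissions:
        proto = rule.get("IpProtocol")
        if proto == "-1":  # "-1" means all protocols and all ports
            return True
        if proto != "tcp":
            continue
        from_port = rule.get("FromPort")
        to_port = rule.get("ToPort")
        if from_port is not None and to_port is not None and from_port <= port <= to_port:
            return True
    return False

# Example: a self-referencing rule opening 8998 to the group itself
# ("sg-example" is a hypothetical group ID).
rules = [
    {"IpProtocol": "tcp", "FromPort": 8998, "ToPort": 8998,
     "UserIdGroupPairs": [{"GroupId": "sg-example"}]},
]
print(allows_livy(rules))  # → True
```

If this returns False for the dev endpoint's security group, the notebook's Spark magics cannot reach Livy and will fail with the "Error sending http request" message.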

This might be useful: https://aws.amazon.com/blogs/machine-learning/build-amazon-sagemaker-notebooks-backed-by-spark-in-amazon-emr/

Also, if that does not help, try stopping and restarting the notebook; that has helped in the past.
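As a quick, hedged sanity check you can run from a machine in the same VPC/subnet (the private IP below is an assumption for illustration; substitute your dev endpoint's address), a plain TCP connect shows whether traffic actually gets through to Livy:

```python
import socket

def can_reach(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within `timeout` seconds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, unreachable, or timed out
        return False

# Hypothetical private address of the Glue dev endpoint; replace with your own.
print(can_reach("10.0.0.12", 8998, timeout=1.0))
```

A False here, combined with a working setup when networking is skipped, points at the security group (or NACL/route) rather than at the notebook itself.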

Upvotes: 3
