Reputation: 71
I am using SageMaker Notebook in AWS Glue for ETL development.
On importing SparkContext I am getting the error below. I have tried restarting the kernel, but that did not help. Can someone explain point "a" to me?
The code failed because of a fatal error: Error sending http request and maximum retry encountered..
Some things to try:
a. Make sure Spark has enough available resources for Jupyter to create a Spark context.
b. Contact your Jupyter administrator to make sure the Spark magics library is configured correctly.
c. Restart the kernel.
A few points to note:
I am creating the SageMaker notebook from the AWS Console > AWS Glue > Dev endpoints > Notebooks.
The VPC, subnet, and security group of the dev endpoint are the same as those of the RDS instance the connection is supposed to be made to. While creating the dev endpoint, on the networking page I chose an existing connection from the drop-down list, so the VPC, subnet, and security group were filled in automatically.
So the error ("The code failed because...") occurs only when a connection is provided.
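For what it's worth, here is a minimal sanity check I can run from a %%local cell (or a plain Python kernel) to see whether the notebook can reach the Livy endpoint at all; the endpoint address below is a placeholder, not my real one:

```python
import requests

# Placeholder: replace with the Glue dev endpoint's private address.
LIVY_URL = "http://<dev-endpoint-private-address>:8998"

try:
    # Livy exposes a REST API; GET /sessions lists active Spark sessions.
    resp = requests.get(LIVY_URL + "/sessions", timeout=10)
    print(resp.status_code, resp.json())
except requests.exceptions.RequestException as exc:
    # A connection error or timeout here points to a networking or
    # security-group problem rather than a Spark resource issue.
    print("Livy not reachable:", exc)
```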
Any help in resolving this issue would be appreciated.
Upvotes: 1
Views: 7079
Reputation: 46
I am wondering whether your Livy endpoint configuration is valid. Livy runs on port 8998, so you should check whether that port is open in the security group.
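If you want to check that from code rather than the console, here is a rough boto3 sketch (the security group ID is a placeholder, and this is a generic port check, not anything Glue-specific):

```python
import boto3

# Placeholder: the security group attached to the dev endpoint / notebook.
SG_ID = "sg-0123456789abcdef0"

ec2 = boto3.client("ec2")
sg = ec2.describe_security_groups(GroupIds=[SG_ID])["SecurityGroups"][0]

# Look for an inbound rule that covers TCP port 8998 (Livy).
for rule in sg["IpPermissions"]:
    all_traffic = rule.get("IpProtocol") == "-1"  # "-1" means all protocols/ports
    in_range = rule.get("FromPort", -1) <= 8998 <= rule.get("ToPort", -1)
    if all_traffic or in_range:
        print("Inbound rule covering 8998:", rule)
        break
else:
    print("No inbound rule covers port 8998; Livy traffic may be blocked.")
```

As far as I know, Glue connections also require the security group to have a self-referencing inbound rule, so it is worth double-checking that as well.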
This might be useful: https://aws.amazon.com/blogs/machine-learning/build-amazon-sagemaker-notebooks-backed-by-spark-in-amazon-emr/
Also, if that does not help, try stopping and restarting the notebook instance once. That has helped in the past.
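If you want to script the stop/start, a minimal boto3 sketch would be (the notebook instance name is a placeholder):

```python
import boto3

# Placeholder: your SageMaker notebook instance name.
NOTEBOOK_NAME = "aws-glue-dev-notebook"

sm = boto3.client("sagemaker")

# Stop the notebook, wait for it to fully stop, then start it again.
sm.stop_notebook_instance(NotebookInstanceName=NOTEBOOK_NAME)
sm.get_waiter("notebook_instance_stopped").wait(NotebookInstanceName=NOTEBOOK_NAME)
sm.start_notebook_instance(NotebookInstanceName=NOTEBOOK_NAME)
```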
Upvotes: 3