mark.lewis688

Reputation: 41

Bad SSL Key When Trying to Use spark-ec2 Script to Launch Cluster on EC2?

Version of Apache Spark: spark-1.2.1-bin-hadoop2.4
Platform: Ubuntu

I have been using the spark-1.2.1-bin-hadoop2.4/ec2/spark-ec2 script to create temporary clusters on EC2 for testing. All was working well.

Then I started to get the following error when trying to launch the cluster:

[Errno 185090050] _ssl.c:344: error:0B084002:x509 certificate routines:X509_load_cert_crl_file:system lib

I have traced this back to the following line in the spark_ec2.py script:

conn = ec2.connect_to_region(opts.region)

So the error is thrown the first time the script interacts with EC2. Spark uses the Python boto library (included with the Spark download) to make this call.
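
For anyone hitting the same thing, a minimal repro of just that call helps separate spark-ec2 from boto. The region string here is only an example; substitute whatever you pass via --region:

    # Minimal standalone repro of the call spark_ec2.py makes.
    # Assumes AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY are set (spark-ec2
    # requires them too) and that the boto bundled with Spark is importable,
    # e.g. by putting its directory on PYTHONPATH.
    from boto import ec2

    # "us-east-1" is just an example; use the value you pass via --region.
    conn = ec2.connect_to_region("us-east-1")
    print(conn)

If this fails with the same X509 error outside of spark-ec2, the problem is in boto's environment, not in the Spark script.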

I assume the error I am getting is because of a bad cacert.pem file somewhere.

My question: which cacert.pem file gets used when I try to invoke the spark-ec2 script, and why is it not working?
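
A sketch that answers this directly, assuming the boto 2.x layout, where the default CA bundle ships inside the library as boto/cacerts/cacerts.txt and a boto config file can override it via a ca_certificates_file entry:

    # Show which boto got imported, its default CA bundle, and any override
    # coming from a boto config file (e.g. /etc/boto.cfg or ~/.boto).
    import os
    import boto
    import boto.cacerts

    print("boto imported from:", os.path.dirname(boto.__file__))
    print("default CA bundle:",
          os.path.join(os.path.dirname(boto.cacerts.__file__), "cacerts.txt"))
    print("configured override:",
          boto.config.get("Boto", "ca_certificates_file"))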

I also saw this error with spark-1.2.0-bin-hadoop2.4.

Upvotes: 1

Views: 50

Answers (1)

mark.lewis688

Reputation: 41

SOLVED: the boto library embedded with Spark was finding a ~/.boto config file I had from another, non-Spark project (it was actually for Google Cloud Services; GCS had installed it and I had forgotten about it). That config file was what broke the certificate loading.

As soon as I deleted the ~/.boto config file that GCS had installed, everything started working again for Spark!
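
If you want to confirm which config file the embedded boto is silently picking up before deleting anything, here is a small sketch against boto 2.x (BotoConfigLocations is boto's default search list; the BOTO_CONFIG and BOTO_PATH environment variables can change it):

    # Print boto 2.x's config search list and flag the files that exist --
    # any file listed here is read automatically and can override SSL settings.
    import os
    from boto.pyami.config import BotoConfigLocations

    for path in BotoConfigLocations:
        status = "FOUND" if os.path.exists(path) else "missing"
        print(status, path)

My stale ~/.boto showed up in exactly that list. A ca_certificates_file entry pointing at a bundle that no longer existed would explain the X509_load_cert_crl_file error, though that is my inference from the error rather than something I verified against the old file.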

Upvotes: 1
