Sai

Reputation: 3957

How to set the AWS access key and AWS secret key inside spark-shell

Can you let me know the best way to set the AWS access key and AWS secret key while inside spark-shell? I tried setting them using

sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", MY_ACCESS_KEY)
sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", MY_SECRET_KEY)

and got

java.lang.IllegalArgumentException: AWS Access Key ID and Secret Access Key must be specified as the username or password (respectively) of a s3n URL, or by setting the fs.s3n.awsAccessKeyId or fs.s3n.awsSecretAccessKey properties (respectively)

I am able to get it to work by passing the keys as part of the URL

s3n://MY_ACCESS_KEY:MY_SECRET_KEY@BUCKET_NAME/KEYNAME

after replacing the slashes in my secret key with %2F, but I wanted to know whether there is an alternative to embedding my access key and secret key in the URL.

Upvotes: 3

Views: 5089

Answers (2)

Erik Schmiegelow

Reputation: 2759

In addition to Holden's answer, here's a more specific example:

import org.apache.hadoop.mapred.JobConf

// copy the existing Hadoop configuration and add the S3 credentials to it
val jobConf = new JobConf(sparkContext.hadoopConfiguration)
jobConf.set("fs.s3n.awsAccessKeyId", MY_ACCESS_KEY)
jobConf.set("fs.s3n.awsSecretAccessKey", MY_SECRET_KEY)

// pass the JobConf to hadoopRDD rather than embedding the keys in the URL
val rdd = sparkContext.hadoopRDD(jobConf, ...)
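
For reference, here is a fuller sketch of what the elided call could look like when reading plain text from S3 inside spark-shell (where sc is the shell's built-in SparkContext). The s3n://BUCKET_NAME/KEYNAME path, TextInputFormat, and the LongWritable/Text key and value classes are illustrative assumptions, and MY_ACCESS_KEY / MY_SECRET_KEY stand in for the actual credential strings:

import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapred.{FileInputFormat, JobConf, TextInputFormat}

// credentials go on the JobConf rather than into the URL
val jobConf = new JobConf(sc.hadoopConfiguration)
jobConf.set("fs.s3n.awsAccessKeyId", MY_ACCESS_KEY)
jobConf.set("fs.s3n.awsSecretAccessKey", MY_SECRET_KEY)

// the input path also lives on the JobConf, so the URL stays free of keys
FileInputFormat.setInputPaths(jobConf, "s3n://BUCKET_NAME/KEYNAME")

// hadoopRDD yields (key, value) pairs; with TextInputFormat the value is the line text
val lines = sc.hadoopRDD(jobConf,
                         classOf[TextInputFormat],
                         classOf[LongWritable],
                         classOf[Text])
              .map { case (_, text) => text.toString }

lines.take(5).foreach(println)

Because the secret key never appears in the URL, there is no need to percent-encode the slashes in it.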

Upvotes: 3

Holden

Reputation: 7452

You can use the hadoopRDD function and specify the JobConf object directly with the required properties.

Upvotes: 2
