madhu sudhan

Reputation: 177

Hadoop `Unable to load AWS credentials from any provider in the chain`

My use case is to upload files from an on-premises Hadoop cluster to AWS S3. Before attempting that, I tried a simple ls command to list the bucket contents from Hadoop.

Before running any command, I exported the keys, something like this:

export AWS_ACCESS_KEY_ID=<AccessKeyId>
export AWS_SECRET_ACCESS_KEY=<SecretKey>
export AWS_SESSION_TOKEN=<SessionToken>

I was able to see the bucket contents with the following command:

aws s3api list-objects --bucket <bucket_name>

However, when I run the following command:

hadoop fs -Dfs.s3a.aws.credentials.provider="org.apache.hadoop.fs.s3a.TemporaryAWSCredentialsProvider" -Dfs.s3.access.key="<Access key>" -Dfs.s3.secret.key="<Aws access key>" -Dfs.s3a.session.token="<session_token>" -libjars <path to hadoop-aws.jar file> -ls s3a://<bucket_name>/

I get the following error:

AmazonClientException: Unable to load AWS credentials from any provider in the chain

My question is: even though I am providing the AWS credentials both as environment variables and on the command line with -Dfs.s3.access.key, why am I still seeing the AmazonClientException?

Upvotes: 2

Views: 8455

Answers (1)

jarmod

Reputation: 78653

Remove the fs.s3a.aws.credentials.provider option and retry. If it is unspecified, the default list of credential provider classes is queried in sequence (see the docs).
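
For illustration, a minimal sketch of the retry under that suggestion, assuming the AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_SESSION_TOKEN variables from the question are still exported in the same shell (the default provider chain includes the AWS SDK's environment-variable provider, which should also pick up the session token); the jar path and bucket name are the question's placeholders:

# credentials are taken from the exported AWS_* environment variables,
# resolved by the default credential provider chain
hadoop fs -libjars <path to hadoop-aws.jar file> -ls s3a://<bucket_name>/

If you prefer passing the keys on the command line instead, note that the S3A property names are fs.s3a.access.key and fs.s3a.secret.key (the command in the question used the fs.s3. prefix).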

Upvotes: 4
