Reputation: 3050
From my local PC, I am trying to write my DataFrame to S3. Below is my code snippet:
sparkContext.hadoopConfiguration.set("fs.s3a.awsAccessKeyId", Util.AWS_ACCESS_KEY)
sparkContext.hadoopConfiguration.set("fs.s3a.awsSecretAccessKey", Util.AWS_SECRET_ACCESS_KEY)
sparkContext.hadoopConfiguration.set("fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
empTableDF.coalesce(1).write
  .format("csv")
  .option("header", "true")
  .mode(SaveMode.Overwrite)
  .save("s3a://welpocstg/")
While running it, I am getting the below exception:
com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain
My pom.xml:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.7</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.7</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-aws</artifactId>
    <version>2.7.7</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.httpcomponents/httpclient -->
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <version>4.5.6</version>
</dependency>
Upvotes: 0
Views: 1057
Reputation: 10372
Can you try with the change below?
sparkContext.hadoopConfiguration.set("fs.s3a.access.key", Util.AWS_ACCESS_KEY)
sparkContext.hadoopConfiguration.set("fs.s3a.secret.key", Util.AWS_SECRET_ACCESS_KEY)
Seq("1","2","3").toDF("id")
.coalesce(1)
.write
.format("csv")
.option("header", "true")
.mode(SaveMode.Overwrite)
.save("s3a://welpocstg/")
Upvotes: 1