Reputation: 6095
I want to save and read a Spark DataFrame from AWS S3. I googled a lot but found nothing of much use.
The code that I have written is as follows:
val spark = SparkSession.builder().master("local").appName("test").getOrCreate()
spark.sparkContext.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", "**********")
spark.sparkContext.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", "********************")
import spark.implicits._
spark.read.textFile("s3n://myBucket/testFile").show(false)
List(1,2,3,4).toDF.write.parquet("s3n://myBucket/test/abc.parquet")
But when I run it I get the following error:
org.apache.hadoop.fs.s3.S3Exception: org.jets3t.service.S3ServiceException: S3 HEAD request failed for '/myBucket/testFile' - ResponseCode=403, ResponseMessage=Forbidden
[info] at org.apache.hadoop.fs.s3native.Jets3tNativeFileSystemStore.handleServiceException(Jets3tNativeFileSystemStore.java:245)
[info] at org.apache.hadoop.fs.s3native.Jets3tNativeFileSystemStore.retrieveMetadata(Jets3tNativeFileSystemStore.java:119)
[info] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[info] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[info] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[info] at java.lang.reflect.Method.invoke(Method.java:498)
[info] at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
[info] at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
[info] at org.apache.hadoop.fs.s3native.$Proxy15.retrieveMetadata(Unknown Source)
[info] at org.apache.hadoop.fs.s3native.NativeS3FileSystem.getFileStatus(NativeS3FileSystem.java:414)
[info] ...
[info] Cause: org.jets3t.service.S3ServiceException: S3 HEAD request failed for '/myBucket/testFile' - ResponseCode=403, ResponseMessage=Forbidden
[info] at org.jets3t.service.impl.rest.httpclient.RestS3Service.performRequest(RestS3Service.java:477)
[info] at org.jets3t.service.impl.rest.httpclient.RestS3Service.performRestHead(RestS3Service.java:718)
[info] at org.jets3t.service.impl.rest.httpclient.RestS3Service.getObjectImpl(RestS3Service.java:1599)
[info] at org.jets3t.service.impl.rest.httpclient.RestS3Service.getObjectDetailsImpl(RestS3Service.java:1535)
[info] at org.jets3t.service.S3Service.getObjectDetails(S3Service.java:1987)
[info] at org.jets3t.service.S3Service.getObjectDetails(S3Service.java:1332)
[info] at org.apache.hadoop.fs.s3native.Jets3tNativeFileSystemStore.retrieveMetadata(Jets3tNativeFileSystemStore.java:111)
[info] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[info] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[info] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[info] ...
[info] Cause: org.jets3t.service.impl.rest.HttpException:
[info] at org.jets3t.service.impl.rest.httpclient.RestS3Service.performRequest(RestS3Service.java:475)
[info] at org.jets3t.service.impl.rest.httpclient.RestS3Service.performRestHead(RestS3Service.java:718)
[info] at org.jets3t.service.impl.rest.httpclient.RestS3Service.getObjectImpl(RestS3Service.java:1599)
[info] at org.jets3t.service.impl.rest.httpclient.RestS3Service.getObjectDetailsImpl(RestS3Service.java:1535)
[info] at org.jets3t.service.S3Service.getObjectDetails(S3Service.java:1987)
[info] at org.jets3t.service.S3Service.getObjectDetails(S3Service.java:1332)
[info] at org.apache.hadoop.fs.s3native.Jets3tNativeFileSystemStore.retrieveMetadata(Jets3tNativeFileSystemStore.java:111)
[info] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[info] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[info] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[info] ...
I am using
Any help is appreciated!
Upvotes: 1
Views: 2129
Reputation: 461
I have tried the following on Spark version 2.1.1 and it worked fine for me.
Step 1: Download the following jars:
-- hadoop-aws-2.7.3.jar
-- aws-java-sdk-1.7.4.jar
Note:
If you are not able to find the above jars, you can get them from the hadoop-2.7.3 distribution.
Step 2: Place the above jars into $SPARK_HOME/jars/
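Alternatively, instead of copying the jars by hand you can let Spark resolve them from Maven with the --packages flag (hadoop-aws 2.7.3 pulls in the matching aws-java-sdk 1.7.4 as a transitive dependency), for example:
spark-shell --packages org.apache.hadoop:hadoop-aws:2.7.3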
Step 3: code:
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.sql.SparkSession

val conf = new SparkConf().setMaster("local").setAppName("My App")
val sc = new SparkContext(conf)

// Credentials for the s3a connector go into the Hadoop configuration
sc.hadoopConfiguration.set("fs.s3a.access.key", "***********")
sc.hadoopConfiguration.set("fs.s3a.secret.key", "******************")

val input = sc.textFile("s3a://mybucket/*.txt")

// toDF needs a SparkSession and its implicits, not just a SparkContext
val spark = SparkSession.builder().config(conf).getOrCreate()
import spark.implicits._
List(1, 2, 3, 4).toDF.write.parquet("s3a://mybucket/abc.parquet")
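Note that this code uses the s3a:// scheme rather than the s3n:// scheme from the question: s3a is the newer connector shipped in hadoop-aws and is the recommended one on Hadoop 2.7+.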
Upvotes: 2
Reputation: 13480
Set the secrets in the Spark conf itself with options prefixed "spark.hadoop." (e.g. "spark.hadoop.fs.s3n.awsAccessKeyId"), so that Spark propagates them to the executors along with the work.
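A minimal sketch of this approach, reusing the fs.s3n key names and paths from the question:
import org.apache.spark.sql.SparkSession

// Options prefixed with "spark.hadoop." are copied into the Hadoop
// configuration on the driver and every executor, so tasks that read
// from S3 can authenticate.
val spark = SparkSession.builder()
  .master("local")
  .appName("test")
  .config("spark.hadoop.fs.s3n.awsAccessKeyId", "**********")
  .config("spark.hadoop.fs.s3n.awsSecretAccessKey", "********************")
  .getOrCreate()

spark.read.textFile("s3n://myBucket/testFile").show(false)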
Upvotes: 1