Reputation: 635
I am trying to access a bucket and all of its objects using the AWS SDK, but while running the code I get the following error:
Exception in thread "main" com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: X), S3 Extended Request ID: Y=
Kindly suggest where I am going wrong and why the Access Denied error occurs, although I have granted all of the following permissions on the bucket:
s3:GetObject
s3:GetObjectVersion
s3:GetObjectAcl
s3:GetBucketAcl
s3:GetBucketCORS
s3:GetBucketLocation
s3:GetBucketLogging
s3:ListBucket
s3:ListBucketVersions
s3:ListBucketMultipartUploads
s3:GetObjectTorrent
s3:GetObjectVersionAcl
Code is as follows:
AWSCredentials credentials = new BasicAWSCredentials(accessKey, secretKey);
ClientConfiguration clientConfig = new ClientConfiguration();
clientConfig.setProtocol(Protocol.HTTP);
AmazonS3 conn = new AmazonS3Client(credentials, clientConfig);
conn.setEndpoint(bucketName);
Bucket bucket = conn.createBucket(bucketName);
ObjectListing objects = conn.listObjects(bucket.getName());
do {
    for (S3ObjectSummary objectSummary : objects.getObjectSummaries()) {
        System.out.println(objectSummary.getKey() + "\t" +
                objectSummary.getSize() + "\t" +
                StringUtils.fromDate(objectSummary.getLastModified()));
    }
    objects = conn.listNextBatchOfObjects(objects);
} while (objects.isTruncated());
Upvotes: 36
Views: 81679
Reputation: 437
Please check your region and set it explicitly when building the client:
AWSCredentials credentials = new BasicAWSCredentials("accessKey", "secretKey");
AmazonS3 s3client = AmazonS3ClientBuilder.standard()
.withCredentials(new AWSStaticCredentialsProvider(credentials)).withRegion(Regions.AP_SOUTHEAST_1)
.build();
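If you are not sure which region the bucket actually lives in, it can be looked up first. This is a minimal sketch, assuming SDK v1; the class name and bucket name are placeholders, not values from the question (note that buckets in us-east-1 are reported as "US" by this call):
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class FindBucketRegion {
    public static void main(String[] args) {
        // Placeholder credentials and bucket name -- replace with your own
        BasicAWSCredentials credentials = new BasicAWSCredentials("accessKey", "secretKey");
        String bucketName = "my-bucket";

        AmazonS3 s3client = AmazonS3ClientBuilder.standard()
                .withCredentials(new AWSStaticCredentialsProvider(credentials))
                .withRegion(Regions.AP_SOUTHEAST_1)  // region used to build the client
                .build();

        // Returns the region the bucket was created in, e.g. "ap-southeast-1"
        System.out.println("Bucket region: " + s3client.getBucketLocation(bucketName));
    }
}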
Upvotes: 0
Reputation: 1
// Save an image from the request as an object in the S3 bucket
byte[] imageBytes = request.getThumbnail().readAllBytes();
InputStream inputStream = new ByteArrayInputStream(imageBytes);

ObjectMetadata metadata = new ObjectMetadata();
metadata.setContentLength(imageBytes.length);
metadata.setContentType("image/png");

// Use the current timestamp as the object name
String image = String.valueOf(System.currentTimeMillis());
String key = "image/" + image;

s3.putObject(new PutObjectRequest(bucketName, key, inputStream, metadata)
        .withCannedAcl(CannedAccessControlList.Private));
Upvotes: 0
Reputation: 7099
I was getting the same exception, and this is how I fixed it:
The S3 bucket objects were encrypted with server-side encryption using an AWS KMS key (SSE-KMS). I had to add the app/Lambda role as a user of the encryption key.
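If you would rather do this programmatically than through the key policy in the console, a grant can be created on the key. This is a minimal sketch, assuming SDK v1; the key ID and role ARN are placeholders, not real values:
import com.amazonaws.services.kms.AWSKMS;
import com.amazonaws.services.kms.AWSKMSClientBuilder;
import com.amazonaws.services.kms.model.CreateGrantRequest;
import com.amazonaws.services.kms.model.CreateGrantResult;
import com.amazonaws.services.kms.model.GrantOperation;

public class GrantKeyAccess {
    public static void main(String[] args) {
        AWSKMS kms = AWSKMSClientBuilder.defaultClient();

        // Placeholder key ID and role ARN -- replace with your own values
        String keyId = "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE-KEY-ID";
        String roleArn = "arn:aws:iam::111122223333:role/my-app-role";

        // Allow the application/Lambda role to decrypt objects encrypted with this key
        CreateGrantResult result = kms.createGrant(new CreateGrantRequest()
                .withKeyId(keyId)
                .withGranteePrincipal(roleArn)
                .withOperations(GrantOperation.Decrypt, GrantOperation.DescribeKey));

        System.out.println("Grant created: " + result.getGrantId());
    }
}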
Upvotes: 1
Reputation: 879
If you still see the error even after setting the right IAM policy and checking the bucket/path, check the Apache HttpClient dependency. HttpClient 4.5.5 works fine, while 4.5.7 and above fails for some weird reason (the folder path separators are not encoded properly). In that case you will have to explicitly pin the Apache HttpClient version to 4.5.5, or at least to some other version that works.
Upvotes: 0
Reputation: 635
The problem is now solved. The issue in the code was that the bucket name was being passed to setEndpoint() instead of the actual S3 endpoint.
Below is the corrected code:
AWSCredentials credentials = new BasicAWSCredentials(accessKey, secretKey);
ClientConfiguration clientConfig = new ClientConfiguration();
clientConfig.setProtocol(Protocol.HTTP);
AmazonS3 conn = new AmazonS3Client(credentials, clientConfig);
conn.setEndpoint("correct end point");
Bucket bucket = conn.createBucket(bucketName);
ObjectListing objects = conn.listObjects(bucket.getName());
do {
    for (S3ObjectSummary objectSummary : objects.getObjectSummaries()) {
        System.out.println(objectSummary.getKey() + "\t" +
                objectSummary.getSize() + "\t" +
                StringUtils.fromDate(objectSummary.getLastModified()));
    }
    objects = conn.listNextBatchOfObjects(objects);
} while (objects.isTruncated());
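As a side note, setEndpoint() is deprecated in newer versions of the SDK; the endpoint and signing region can instead be supplied through AmazonS3ClientBuilder, as in the answer above. A minimal sketch, assuming SDK 1.11+; the endpoint, region, and bucket name are placeholders:
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ObjectListing;
import com.amazonaws.services.s3.model.S3ObjectSummary;

public class ListBucketObjects {
    public static void main(String[] args) {
        // Placeholder credentials, endpoint, and bucket name -- replace with your own
        BasicAWSCredentials credentials = new BasicAWSCredentials("accessKey", "secretKey");
        String bucketName = "my-bucket";

        AmazonS3 conn = AmazonS3ClientBuilder.standard()
                .withCredentials(new AWSStaticCredentialsProvider(credentials))
                .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(
                        "https://s3.ap-southeast-1.amazonaws.com", "ap-southeast-1"))
                .build();

        ObjectListing objects = conn.listObjects(bucketName);
        do {
            for (S3ObjectSummary objectSummary : objects.getObjectSummaries()) {
                System.out.println(objectSummary.getKey() + "\t" + objectSummary.getSize());
            }
            objects = conn.listNextBatchOfObjects(objects);
        } while (objects.isTruncated());
    }
}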
Upvotes: 10
Reputation: 781
In the Permissions tab of the bucket, I unchecked:
- Manage public access control lists (ACLs) for this bucket
- Block new public ACLs and uploading public objects (Recommended)
and the problem was gone.
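For reference, these checkboxes roughly correspond to the BlockPublicAcls / IgnorePublicAcls settings of the bucket's public access block configuration, which can also be changed from code. A minimal sketch, assuming a recent SDK v1 release (1.11.452+) and a placeholder bucket name; only do this if the bucket is genuinely meant to allow public ACLs:
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.PublicAccessBlockConfiguration;
import com.amazonaws.services.s3.model.SetPublicAccessBlockRequest;

public class AllowPublicAcls {
    public static void main(String[] args) {
        // Placeholder bucket name -- replace with your own
        String bucketName = "my-bucket";

        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // Note: this call replaces the bucket's entire public access block
        // configuration; fields left unset default to false
        s3.setPublicAccessBlock(new SetPublicAccessBlockRequest()
                .withBucketName(bucketName)
                .withPublicAccessBlockConfiguration(new PublicAccessBlockConfiguration()
                        .withBlockPublicAcls(false)
                        .withIgnorePublicAcls(false)));
    }
}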
Upvotes: 0
Reputation: 16522
Go to IAM and check whether the user [ Access Key & Secret Key ] being used for the API has the privileges to use the S3 APIs.
Attach an S3 policy to the specified user: try S3 Full Access first; you can fine-grain the access once this works. For more information, check this link: [ Managing IAM Policies ]
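To confirm exactly which IAM user or role a given access key/secret key pair resolves to before digging into policies, you can ask STS directly. A minimal sketch, assuming SDK v1 and placeholder credentials:
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.securitytoken.AWSSecurityTokenService;
import com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClientBuilder;
import com.amazonaws.services.securitytoken.model.GetCallerIdentityRequest;
import com.amazonaws.services.securitytoken.model.GetCallerIdentityResult;

public class WhoAmI {
    public static void main(String[] args) {
        // Placeholder credentials -- replace with the key pair your S3 client uses
        BasicAWSCredentials credentials = new BasicAWSCredentials("accessKey", "secretKey");

        AWSSecurityTokenService sts = AWSSecurityTokenServiceClientBuilder.standard()
                .withCredentials(new AWSStaticCredentialsProvider(credentials))
                .withRegion(Regions.US_EAST_1)
                .build();

        GetCallerIdentityResult identity = sts.getCallerIdentity(new GetCallerIdentityRequest());
        // The ARN tells you which IAM user/role the S3 policy must be attached to
        System.out.println("Account: " + identity.getAccount());
        System.out.println("ARN:     " + identity.getArn());
    }
}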
Upvotes: 38