Reputation: 12592
Following is what I tried, but it didn't work. I think the problem is that the generated keys are only valid within the scope of the EC2 instance, so when Redshift runs the COPY command with them, they are not recognized.
AWSCredentialsProvider credentialProvider = new DefaultAWSCredentialsProviderChain();
AWSCredentials credentials = credentialProvider.getCredentials();
return new StringBuilder("COPY ").append(tmpPrefix)
    .append(tableName)
    .append(" FROM '")
    .append(filePath)
    .append("' WITH CREDENTIALS 'aws_access_key_id=")
    .append(credentials.getAWSAccessKeyId())
    .append(";aws_secret_access_key=")
    .append(credentials.getAWSSecretKey())
    .append("' JSON 'auto' GZIP ACCEPTINVCHARS ' ' TRUNCATECOLUMNS TRIMBLANKS;")
    .toString();
Following is the error I got:
[Amazon](500310) Invalid operation: S3ServiceException:The AWS Access Key Id you provided does not exist in our records.,Status 403,Error InvalidAccessKeyId....
Any idea how to get this working?
Upvotes: 0
Views: 634
Reputation: 35149
You are using an IAM role, which gives you a set of temporary credentials plus a session token.
When you execute a COPY command with temporary credentials (access key, secret key, token), you have to provide the token as well:
credentials 'aws_access_key_id=<temporary-access-key-id>;aws_secret_access_key=<temporary-secret-access-key>;token=<temporary-token>';
Take a look at the docs http://docs.aws.amazon.com/redshift/latest/dg/t_loading-tables-from-s3.html
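For reference, here is a minimal sketch of how the builder from the question could append the token, assuming the AWS SDK for Java v1 (the credentials returned by the default provider chain on an EC2 instance profile implement AWSSessionCredentials, which exposes the token). The class and method names are just illustrative:

    import com.amazonaws.auth.AWSCredentials;
    import com.amazonaws.auth.AWSSessionCredentials;
    import com.amazonaws.auth.DefaultAWSCredentialsProviderChain;

    public class CopyCommandBuilder {

        // Builds a COPY statement; if the resolved credentials are temporary
        // (role/instance-profile based), the session token is included so
        // Redshift can validate the access key.
        public static String buildCopy(String tmpPrefix, String tableName, String filePath) {
            AWSCredentials creds = new DefaultAWSCredentialsProviderChain().getCredentials();

            StringBuilder sql = new StringBuilder("COPY ").append(tmpPrefix)
                    .append(tableName)
                    .append(" FROM '")
                    .append(filePath)
                    .append("' WITH CREDENTIALS 'aws_access_key_id=")
                    .append(creds.getAWSAccessKeyId())
                    .append(";aws_secret_access_key=")
                    .append(creds.getAWSSecretKey());

            // Temporary credentials carry a session token; without it Redshift
            // rejects the key id with InvalidAccessKeyId.
            if (creds instanceof AWSSessionCredentials) {
                sql.append(";token=").append(((AWSSessionCredentials) creds).getSessionToken());
            }

            return sql.append("' JSON 'auto' GZIP ACCEPTINVCHARS ' ' TRUNCATECOLUMNS TRIMBLANKS;")
                      .toString();
        }
    }

Note that the temporary credentials expire, so the statement should be built (and the provider chain queried) shortly before each COPY rather than cached.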
Upvotes: 2