Reputation: 450
I am trying to read a CSV file from Amazon S3 and I need to set the credentials at runtime. But I can't get past the credentials check. Is there an alternative, or any suggestion?
import org.apache.flink.api.java.ExecutionEnvironment
import org.apache.flink.configuration.Configuration

object AwsS3CSVTest {
  def main(args: Array[String]): Unit = {
    val conf = new Configuration()
    conf.setString("fs.s3a.access.key", "***")
    conf.setString("fs.s3a.secret.key", "***")
    val env = ExecutionEnvironment.createLocalEnvironment(conf)
    val datafile = env.readCsvFile("s3a://anybucket/anyfile.csv")
      .ignoreFirstLine()
      .fieldDelimiter(";")
      .types(classOf[String], classOf[String], classOf[String], classOf[String], classOf[String], classOf[String])
    datafile.print()
  }
}
00:49:55.558|DEBUG| o.a.h.f.s.AWSCredentialProviderList No credentials from TemporaryAWSCredentialsProvider: org.apache.hadoop.fs.s3a.auth.NoAwsCredentialsException: Session credentials in Hadoop configuration: No AWS Credentials
00:49:55.558|DEBUG| o.a.h.f.s.AWSCredentialProviderList No credentials from SimpleAWSCredentialsProvider: org.apache.hadoop.fs.s3a.auth.NoAwsCredentialsException: SimpleAWSCredentialsProvider: No AWS credentials in the Hadoop configuration
00:49:55.558|DEBUG| o.a.h.f.s.AWSCredentialProviderList No credentials provided by EnvironmentVariableCredentialsProvider: com.amazonaws.SdkClientException: Unable to load AWS credentials from environment variables (AWS_ACCESS_KEY_ID (or AWS_ACCESS_KEY) and AWS_SECRET_KEY (or AWS_SECRET_ACCESS_KEY))
com.amazonaws.SdkClientException: Unable to load AWS credentials from environment variables (AWS_ACCESS_KEY_ID (or AWS_ACCESS_KEY) and AWS_SECRET_KEY (or AWS_SECRET_ACCESS_KEY))
Upvotes: 0
Views: 2546
Reputation: 2108
As explained in https://nightlies.apache.org/flink/flink-docs-stable/docs/deployment/filesystems/s3/#configure-access-credentials, you should use IAM roles or access keys configured in flink-conf.yaml. You can't set the credentials in code, because the S3 filesystem implementations are loaded through Flink's plugin mechanism and don't pick up the Configuration you pass to the environment.
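For example (placeholder values), the access-key variant would look like this in flink-conf.yaml; Flink forwards config keys with the s3. prefix to the underlying s3a filesystem:

```yaml
# flink-conf.yaml — picked up when the S3 filesystem plugin is initialized
s3.access-key: your-access-key
s3.secret-key: your-secret-key
```

Note also that the flink-s3-fs-hadoop jar must sit in its own directory under plugins/ (e.g. plugins/s3-fs-hadoop/), not on the classpath, for the s3a:// scheme to be registered.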
Upvotes: 0