Albert T. Wong

Reputation: 1653

Using MinIO, how do I authenticate to an Amazon S3 endpoint in Java?

So I have a Java app:

java -jar utilities-0.1.0-SNAPSHOT-bundled.jar --datasetConfig onetable.yaml

I want it to connect to MinIO, so I set:

export AWS_ACCESS_KEY_ID=admin
export AWS_SECRET_ACCESS_KEY=password

What do I use for S3? None of the variables below work. It feels like they should have, per https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-endpoints.html

export S3_ENDPOINT=http://minio:9000
export AWS_ENDPOINT_URL_S3=http://minio:9000
export AWS_ENDPOINT=http://minio:9000
export AWS_IGNORE_CONFIGURED_ENDPOINT_URLS=true

Error message:

WARNING: Runtime environment or build system does not support multi-release JARs. This will impact location-based features.
2024-02-22 23:09:01 INFO  io.onetable.utilities.RunSync:147 - Running sync for basePath s3://huditest/hudi-dataset/people for following table formats [DELTA, ICEBERG]
2024-02-22 23:09:01 INFO  org.apache.hudi.common.table.HoodieTableMetaClient:133 - Loading HoodieTableMetaClient from s3://huditest/hudi-dataset/people
2024-02-22 23:09:01 WARN  org.apache.hadoop.util.NativeCodeLoader:60 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2024-02-22 23:09:01 WARN  org.apache.hadoop.metrics2.impl.MetricsConfig:136 - Cannot locate configuration: tried hadoop-metrics2-s3a-file-system.properties,hadoop-metrics2.properties
2024-02-22 23:09:03 ERROR io.onetable.utilities.RunSync:170 - Error running sync for s3://huditest/hudi-dataset/people
org.apache.hudi.exception.HoodieIOException: Could not check if s3://huditest/hudi-dataset/people is a valid table
        at org.apache.hudi.exception.TableNotFoundException.checkTableValidity(TableNotFoundException.java:59) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at org.apache.hudi.common.table.HoodieTableMetaClient.<init>(HoodieTableMetaClient.java:140) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at org.apache.hudi.common.table.HoodieTableMetaClient.newMetaClient(HoodieTableMetaClient.java:692) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at org.apache.hudi.common.table.HoodieTableMetaClient.access$000(HoodieTableMetaClient.java:85) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at org.apache.hudi.common.table.HoodieTableMetaClient$Builder.build(HoodieTableMetaClient.java:774) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at io.onetable.hudi.HudiSourceClientProvider.getSourceClientInstance(HudiSourceClientProvider.java:42) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at io.onetable.hudi.HudiSourceClientProvider.getSourceClientInstance(HudiSourceClientProvider.java:31) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at io.onetable.client.OneTableClient.sync(OneTableClient.java:90) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at io.onetable.utilities.RunSync.main(RunSync.java:168) [utilities-0.1.0-SNAPSHOT-bundled.jar:?]
Caused by: java.nio.file.AccessDeniedException: s3://huditest/hudi-dataset/people/.hoodie: getFileStatus on s3://huditest/hudi-dataset/people/.hoodie: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: YCAK9KVEG2Y3GP3K; S3 Extended Request ID: GiIg4bCsQ12KMZlc1dLtjEttzoyOsW8Iix3rRQsYQIsUarafzOmmEru8ln6lZGWlP73jQkQd5dnXN9nBBdNoyLGqYjwwhgxE; Proxy: null), S3 Extended Request ID: GiIg4bCsQ12KMZlc1dLtjEttzoyOsW8Iix3rRQsYQIsUarafzOmmEru8ln6lZGWlP73jQkQd5dnXN9nBBdNoyLGqYjwwhgxE:403 Forbidden
        at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:249) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:170) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:3286) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at org.apache.hadoop.fs.s3a.S3AFileSystem.innerGetFileStatus(S3AFileSystem.java:3185) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:3053) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at org.apache.hudi.common.fs.HoodieWrapperFileSystem.lambda$getFileStatus$17(HoodieWrapperFileSystem.java:410) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at org.apache.hudi.common.fs.HoodieWrapperFileSystem.executeFuncWithTimeMetrics(HoodieWrapperFileSystem.java:114) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at org.apache.hudi.common.fs.HoodieWrapperFileSystem.getFileStatus(HoodieWrapperFileSystem.java:404) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at org.apache.hudi.exception.TableNotFoundException.checkTableValidity(TableNotFoundException.java:51) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        ... 8 more
Caused by: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: YCAK9KVEG2Y3GP3K; S3 Extended Request ID: GiIg4bCsQ12KMZlc1dLtjEttzoyOsW8Iix3rRQsYQIsUarafzOmmEru8ln6lZGWlP73jQkQd5dnXN9nBBdNoyLGqYjwwhgxE; Proxy: null)
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1879) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleServiceErrorResponse(AmazonHttpClient.java:1418) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1387) ~[utilities-0.1.0-SNAPSHOT-bundled.jar:?]
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1157) ~[utilities-0.1.0-SNAPSHOT-bundled.jar

Reference: https://github.com/onetable-io/onetable/issues/327

Upvotes: 0

Views: 383

Answers (1)

Albert T. Wong

Reputation: 1653

It wouldn't pick up the variables from the shell. I had to pass a Hadoop configuration file explicitly:

java -jar utilities-0.1.0-SNAPSHOT-bundled.jar --datasetConfig onetable.yaml -p ../conf/core-site.xml

Where the XML is the `core-site.xml` from https://github.com/StarRocks/demo/blob/master/documentation-samples/datalakehouse/conf/core-site.xml
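For reference, a minimal `core-site.xml` that points the Hadoop S3A connector at MinIO typically looks like the sketch below. The endpoint, access key, and secret key values are placeholders matching the question's setup; adjust them for your deployment. Path-style access is required because MinIO does not serve virtual-hosted-style bucket URLs by default.

```xml
<?xml version="1.0"?>
<configuration>
    <!-- Point S3A at the MinIO server instead of AWS -->
    <property>
        <name>fs.s3a.endpoint</name>
        <value>http://minio:9000</value>
    </property>
    <!-- Static credentials (placeholders from the question) -->
    <property>
        <name>fs.s3a.access.key</name>
        <value>admin</value>
    </property>
    <property>
        <name>fs.s3a.secret.key</name>
        <value>password</value>
    </property>
    <!-- MinIO requires path-style addressing: http://minio:9000/bucket/key -->
    <property>
        <name>fs.s3a.path.style.access</name>
        <value>true</value>
    </property>
    <!-- Plain HTTP endpoint, so disable SSL -->
    <property>
        <name>fs.s3a.connection.ssl.enabled</name>
        <value>false</value>
    </property>
</configuration>
```

The key point is that the Hadoop S3A filesystem reads `fs.s3a.*` properties from the Hadoop configuration, not the AWS CLI/SDK environment variables like `AWS_ENDPOINT_URL_S3`, which is why exporting those in the shell had no effect.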

Upvotes: 0
