Azimuts

Reputation: 1302

Amazon S3 Uploading via Java API: InputStream Sources

I'm testing different ways to upload small objects to S3 using "aws-java-sdk-s3". Since the objects are small, I use the default API (the Transfer API is for large and huge objects).

  1. Uploading a File as the source: perfect!

     File file = ....
     s3Client.putObject(new PutObjectRequest(bucket, key, file));
    
  2. Uploading a ByteArrayInputStream: perfect!

    InputStream stream = new ByteArrayInputStream("How are you?".getBytes());
    s3Client.putObject(new PutObjectRequest(bucket, key, stream, new ObjectMetadata()));
    
  3. Uploading a resource as a stream: problems!

    InputStream stream = this.getClass().getResourceAsStream("myFile.data");
    s3Client.putObject(new PutObjectRequest(bucket, key, stream, new ObjectMetadata()));
    

The Exception:

com.amazonaws.ResetException: The request to the service failed with a retryable reason, but resetting the request input stream has failed.
 See exception.getExtraInfo or debug-level logging for the original failure that caused this retry.;  
If the request involves an input stream, the maximum stream buffer size can be configured via request.getRequestClientOptions().setReadLimit(int)

Caused by: java.io.IOException: Resetting to invalid mark
    at java.io.BufferedInputStream.reset(BufferedInputStream.java:448)
    at com.amazonaws.internal.SdkFilterInputStream.reset(SdkFilterInputStream.java:112)
    at com.amazonaws.internal.SdkFilterInputStream.reset(SdkFilterInputStream.java:112)
    at com.amazonaws.util.LengthCheckInputStream.reset(LengthCheckInputStream.java:126)
    at com.amazonaws.internal.SdkFilterInputStream.reset(SdkFilterInputStream.java:112)

I can convert the classpath resource to a File object using some Apache FileUtils, but that's a bit ugly.
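
Something like this (a sketch, assuming commons-io is on the classpath):

    // Workaround sketch: materialize the classpath resource as a temp file
    // and upload the File instead of the stream.
    File tmp = File.createTempFile("myFile", ".data");
    FileUtils.copyInputStreamToFile(this.getClass().getResourceAsStream("myFile.data"), tmp);
    s3Client.putObject(new PutObjectRequest(bucket, key, tmp));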

  1. Do I have to configure the ReadLimit depending on the type of stream? (See the sketch below for the kind of configuration I mean.)
  2. What value is recommended?
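
For reference, the configuration I mean would look something like this (DEFAULT_STREAM_BUFFER_SIZE here is just an arbitrary example value, not a recommendation):

    // Sketch only: raise the SDK's mark/reset read limit for this request.
    // The limit has to cover however many bytes the SDK may need to re-read
    // on a retry; 128 KB (RequestClientOptions.DEFAULT_STREAM_BUFFER_SIZE)
    // is only an illustrative value.
    InputStream stream = this.getClass().getResourceAsStream("myFile.data");
    PutObjectRequest request = new PutObjectRequest(bucket, key, stream, new ObjectMetadata());
    request.getRequestClientOptions().setReadLimit(RequestClientOptions.DEFAULT_STREAM_BUFFER_SIZE);
    s3Client.putObject(request);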

API version: "aws-java-sdk-s3" rev="1.11.442"

Upvotes: 4

Views: 11719

Answers (2)

Guillaume Blanchet

Reputation: 649

Your

 this.getClass().getResourceAsStream("myFile.data");

returns a BufferedInputStream (as you can see in the exception). When using a BufferedInputStream, you must set the buffer size to at least 128 KB (131072 bytes), as stated in the AWS S3 documentation:

When using an BufferedInputStream as data source, please remember to use a buffer of size no less than RequestClientOptions.DEFAULT_STREAM_BUFFER_SIZE while initializing the BufferedInputStream. This is to ensure that the SDK can correctly mark and reset the stream with enough memory buffer during signing and retries.

https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/s3/AmazonS3Client.html#putObject-java.lang.String-java.lang.String-java.io.InputStream-com.amazonaws.services.s3.model.ObjectMetadata-
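
In code, that advice amounts to something like the following sketch (the bucket/key variables and the empty ObjectMetadata are placeholders taken from the question, not prescribed values):

    import com.amazonaws.RequestClientOptions;
    import java.io.BufferedInputStream;
    import java.io.InputStream;

    // Wrap the resource stream in a BufferedInputStream whose buffer is at
    // least RequestClientOptions.DEFAULT_STREAM_BUFFER_SIZE (128 KB), so the
    // SDK can mark and reset the stream during signing and retries.
    InputStream raw = this.getClass().getResourceAsStream("myFile.data");
    InputStream stream = new BufferedInputStream(raw,
            RequestClientOptions.DEFAULT_STREAM_BUFFER_SIZE);
    s3Client.putObject(new PutObjectRequest(bucket, key, stream, new ObjectMetadata()));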

Upvotes: 1

Shinchan

Reputation: 361

I have implemented a use case that is pretty similar to yours (though not identical): I have to write some data to a JSON file (zipped) and store it in S3. The data is available in a HashMap, so the contents of the map are serialized to JSON and written into the zip. Feel free to ignore this if it does not help. Note that I have never set any read limit anywhere: the payload goes up as a ByteArrayInputStream with a known content length, and a ByteArrayInputStream can be marked and reset over its whole content, so retries are not a problem.

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.Map;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.PutObjectRequest;
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;

public void serializeResults(AmazonS3Client s3, Map<String, Object> dm, String environment)
        throws IOException {
    logger.info("start writeZipToS3"); // logger is a field of the enclosing class
    Gson gson = new GsonBuilder().create();
    try {
        // Serialize the map to JSON and zip it entirely in memory.
        ByteArrayOutputStream byteOut = new ByteArrayOutputStream();
        ZipOutputStream zout = new ZipOutputStream(byteOut);

        ZipEntry ze = new ZipEntry(String.format("results-%s.json", environment));
        zout.putNextEntry(ze);
        String json = gson.toJson(dm);
        zout.write(json.getBytes());
        zout.closeEntry();
        zout.close();

        // Upload the zipped bytes with an explicit content length, so the
        // SDK does not have to buffer or mark/reset an unknown-length stream.
        byte[] bites = byteOut.toByteArray();
        ObjectMetadata om = new ObjectMetadata();
        om.setContentLength(bites.length);
        PutObjectRequest por = new PutObjectRequest("home",
                String.format("zc-service/results-%s.zip", environment),
                new ByteArrayInputStream(bites), om);
        s3.putObject(por);

    } catch (IOException e) {
        e.printStackTrace();
    }
    logger.info("stop writeZipToS3");
}
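
For example, a call might look like this (hypothetical map contents):

    Map<String, Object> results = new HashMap<>();
    results.put("status", "ok");
    results.put("count", 42);
    serializeResults(s3Client, results, "dev");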

I hope that helps you.

Regards

Upvotes: 1
