Wicky Memon

Reputation: 89

Java Heap Space is insufficient to upload files on AWS S3

I'm trying to upload a file to AWS S3 using the Java AWS SDK. The problem is that my application is unable to upload large files because the heap reaches its limit, failing with: java.lang.OutOfMemoryError: Java heap space

I don't think increasing the heap size is a permanent solution, because I have to upload files of up to 100 GB. What should I do?

Here is the code snippet:

        BasicAWSCredentials awsCreds = new BasicAWSCredentials(AID, Akey);
        AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
                .withRegion(Regions.fromName("us-east-2"))
                .withCredentials(new AWSStaticCredentialsProvider(awsCreds))
                .build();

        InputStream is = file.getInputStream();

        if (!s3Client.doesBucketExist(ABuck)) {
            s3Client.createBucket(ABuck);
        }
        s3Client.putObject(new PutObjectRequest(ABuck, AFkey, file.getInputStream(), new ObjectMetadata())
                .withCannedAcl(CannedAccessControlList.PublicRead));

Upvotes: 5

Views: 5338

Answers (2)

xerx593

Reputation: 13261

I strongly recommend calling setContentLength() on the ObjectMetadata, because:

..If not provided, the library will have to buffer the contents of the input stream in order to calculate it.

(..which predictably leads to an OutOfMemoryError on sufficiently large files.)

Source: PutObjectRequest javadoc

Applied to your code:

 // ...
 ObjectMetadata omd = new ObjectMetadata();
 // a tiny line of code, but with a "huge" information gain and memory saving! ;)
 // (for a java.io.File use file.length(); for Spring's MultipartFile use file.getSize())
 omd.setContentLength(file.length());

 s3Client.putObject(new PutObjectRequest(ABuck, AFkey, file.getInputStream(), omd)
         .withCannedAcl(CannedAccessControlList.PublicRead));
 // ...

Upvotes: 10

vavasthi

Reputation: 952

You would need to add example code to get a more specific answer. If you are dealing with a large object, use TransferManager to upload instead of calling putObject directly.
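A minimal sketch of the TransferManager approach, assuming the AWS SDK for Java v1 and a java.io.File as the source (the bucket name, key, and file path below are placeholders, not from the question):

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.TransferManagerBuilder;
import com.amazonaws.services.s3.transfer.Upload;

import java.io.File;

public class TransferManagerUpload {
    public static void main(String[] args) throws InterruptedException {
        // build the client exactly as in the question
        AmazonS3 s3Client = buildClient();

        // TransferManager splits large files into parts and uploads them
        // via S3 multipart upload, so only a bounded amount of data is
        // held in memory at a time instead of the whole file.
        TransferManager tm = TransferManagerBuilder.standard()
                .withS3Client(s3Client)
                .build();

        Upload upload = tm.upload("my-bucket", "my-key", new File("/path/to/big-file"));
        upload.waitForCompletion(); // blocks until the upload finishes

        // pass false to keep the underlying client usable afterwards
        tm.shutdownNow(false);
    }

    private static AmazonS3 buildClient() {
        // placeholder for the credential/region setup shown in the question
        throw new UnsupportedOperationException("configure as in the question");
    }
}
```

Because TransferManager works from a File (or a stream with a known length), it never needs to buffer the full 100 GB in the heap, which is what causes the OutOfMemoryError with a plain putObject on an unsized InputStream.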

Upvotes: 0

Related Questions