lee

Reputation: 2439

How can I upload files larger than 5GB to Amazon S3?

I'm currently using Rails 3.2 with the Carrierwave gem to upload files to Amazon S3. Now I need to handle user-submitted files larger than 5GB, while still using the Carrierwave gem. Are there any other gems, or branches of Carrierwave or Fog, that can handle 5GB+ file uploads to S3?

Edit: I'd prefer not to have to rewrite a complete Rails uploading solution, so links like this won't help: https://gist.github.com/908875.

Upvotes: 5

Views: 2603

Answers (3)

lee

Reputation: 2439

I figured out how to do this and have it working now. In the proper config/environment file, add the following to send files in 100MB chunks to Amazon S3:

CarrierWave.configure do |config|
  # 104857600 bytes == 100MB per multipart chunk
  config.fog_attributes = { :multipart_chunk_size => 104857600 }
end

Since the fog gem has multipart uploads built in (thanks to Veraticus for pointing that out), the appropriate configuration attributes just need to be passed through to fog via Carrierwave. When sending to S3, I received frequent Connection reset by peer (Errno::ECONNRESET) errors, so parts of the upload may have to be retried.
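For what it's worth, here is a minimal sketch of how a retry could look, assuming you have a Carrierwave uploader instance (the helper name and attempt count are mine, not part of Carrierwave; note this retries the whole store!, not individual parts):

# Hypothetical helper: re-run Carrierwave's store! if the
# connection is reset mid-upload.
def store_with_retries!(uploader, attempts = 3)
  uploader.store!
rescue Errno::ECONNRESET
  attempts -= 1
  retry if attempts > 0   # retry restarts the entire upload
  raise
end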

Upvotes: 7

Veraticus

Reputation: 16064

You want to use S3's multipart upload functionality. Helpfully, Fog can indeed handle multipart S3 uploads, as you can see in this pull request.

Unfortunately, Carrierwave does not seem to have this functionality built in, so you'd need to either modify Carrierwave or drop down into Fog manually to upload the file correctly.
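If you go the manual Fog route, a rough sketch looks like this (credentials, bucket name, and file paths below are placeholders; passing :multipart_chunk_size is what makes fog use S3's multipart API):

require 'fog'

# Placeholders throughout -- substitute your own credentials and bucket.
storage = Fog::Storage.new(
  :provider              => 'AWS',
  :aws_access_key_id     => 'YOUR_ACCESS_KEY',
  :aws_secret_access_key => 'YOUR_SECRET_KEY'
)

directory = storage.directories.get('your-bucket')
directory.files.create(
  :key                  => 'videos/big-file.mov',
  :body                 => File.open('/path/to/big-file.mov'),
  :multipart_chunk_size => 100 * 1024 * 1024   # 100MB parts
)

Because the body is an open File and a chunk size is set, fog streams the file to S3 in parts rather than attempting a single 5GB+ PUT.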

Upvotes: 6

Dayan

Reputation: 8031

You will need to break your file into small pieces prior to uploading.

Take a look at the following:

http://www.ruby-forum.com/topic/1282369

http://joemiller.me/2011/02/18/client-support-for-amazon-s3-multipart-uploads-files-5gb/

Either way, you need to split the file.
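For illustration, a rough Ruby sketch of the splitting step (the paths and the 1GB part size are arbitrary choices of mine):

# Split a large file into 1GB pieces so each part can be sent separately.
part_size = 1024 * 1024 * 1024
File.open('/path/to/big-file.mov', 'rb') do |f|
  index = 0
  while (chunk = f.read(part_size))
    File.open("/path/to/parts/big-file.part#{index}", 'wb') { |out| out.write(chunk) }
    index += 1
  end
end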

Upvotes: -2
