Reputation: 33
I want to move my assets folder to Amazon S3, and since it is quite large, I need to upload files both to my local storage and to Amazon S3 through Paperclip while the migration is in progress.
Is there a way to configure Paperclip to store uploaded files both on the filesystem and on Amazon S3?
Upvotes: 1
Views: 595
Reputation: 76784
Maybe you'd benefit from this:
What you'll have to do is first upload to your local storage, and then "asynchronously" upload to S3.
This is typically done through the likes of Resque or DelayedJob (as the tutorial demonstrates). Resque will require you to run a backing store such as Redis on your server; DelayedJob instead queues jobs in a database table.
From the tutorial:
### Models ###

class Person < ActiveRecord::Base
  # Attachment written immediately to the local filesystem on upload
  has_attached_file :local_image,
    path: ":rails_root/public/system/:attachment/:id/:style/:basename.:extension",
    url: "/system/:attachment/:id/:style/:basename.:extension"

  # Attachment stored on S3, populated later by the background job
  has_attached_file :image,
    styles: { large: '500x500#', medium: '200x200#', small: '70x70#' },
    convert_options: { all: '-strip' },
    storage: :s3,
    s3_credentials: "#{Rails.root}/config/s3.yml",
    s3_permissions: :private,
    s3_host_name: 's3-eu-west-1.amazonaws.com',
    s3_headers: { 'Expires' => 1.year.from_now.httpdate,
                  'Content-Disposition' => 'attachment' },
    path: "images/:id/:style/:filename"

  after_save :queue_upload_to_s3

  def queue_upload_to_s3
    Delayed::Job.enqueue ImageJob.new(id) if local_image? && local_image_updated_at_changed?
  end

  def upload_to_s3
    self.image = local_image.to_file
    save!
  end
end

class ImageJob < Struct.new(:person_id)
  def perform
    person = Person.find person_id
    person.upload_to_s3
    # Remove the local copy once the S3 attachment has been saved
    person.local_image.destroy
  end
end
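The s3_credentials option above points at config/s3.yml. A minimal sketch of that file, assuming a flat layout with a single bucket (the values are placeholders; Paperclip also accepts the same keys nested per Rails environment):

# config/s3.yml
access_key_id: YOUR_ACCESS_KEY_ID
secret_access_key: YOUR_SECRET_ACCESS_KEY
bucket: your-bucket-name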
### Views ###

-# app/views/people/edit.html.haml
-# ...
= f.file_field :local_image

-# app/views/people/show.html.haml
- if @person.image?
  = image_tag @person.image.expiring_url(20, :small)
- else
  = image_tag @person.local_image.url, size: '70x70'
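Note that the S3 copy only gets created once a DelayedJob worker is running alongside the app. Assuming the standard delayed_job setup, that is either the bundled rake task or the daemon script generated by the gem:

rake jobs:work            # run a worker in the foreground
script/delayed_job start  # or bin/delayed_job start, depending on your Rails version

Until the job has run, the show view above falls back to the local_image attachment, so visitors never see a broken image during the migration.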
Upvotes: 1