Reputation: 909
While setting up S3 with Heroku, my app crashed.
What I want is for CarrierWave to upload files to S3 storage, and for Rails to load its assets from S3 as well. I opened my S3 account and, within my app bucket, uploaded the entire assets folder with the directory tree as follows:
Here are the steps I followed, based on the guides Heroku: Using AWS S3 to Store Static Assets and File Uploads and Example of setting up S3 with Carrierwave:
In my Gemfile I added
gem 'fog'
I ran the commands:
heroku config:add AWS_ACCESS_KEY_ID=XXXXXX AWS_SECRET_ACCESS_KEY=XXXXXX
heroku config:add S3_BUCKET_NAME=myapp
heroku config:add S3_REGION=ap-southeast-1 # I created my bucket in Singapore
heroku config:add S3_ASSET_URL=https://s3-ap-southeast-1.amazonaws.com/myapp/assets_%24folder%24
Then I ran bundle install.
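Since the CarrierWave initializer (below) reads those config vars, I also did a quick sanity check of my own (not something from the guides) in heroku run console to confirm they are all set:
# Run in `heroku run console` (or a local `rails console`) — just a sanity check.
%w[AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY S3_BUCKET_NAME S3_REGION S3_ASSET_URL].each do |key|
  puts "#{key}: #{ENV[key] ? 'set' : 'MISSING'}"
end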
Then I created config/initializers/carrierwave.rb
# config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    # Configuration for Amazon S3 should be made available through an Environment variable.
    # For local installations, export the env variable through the shell OR
    # if using Passenger, set an Apache environment variable.
    #
    # In Heroku, follow http://devcenter.heroku.com/articles/config-vars
    #
    # $ heroku config:add S3_KEY=your_s3_access_key S3_SECRET=your_s3_secret S3_REGION=eu-west-1 S3_ASSET_URL=http://assets.example.com/ S3_BUCKET_NAME=s3_bucket/folder
    # Configuration for Amazon S3
    :provider              => 'AWS',
    :aws_access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
    :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY'],
    :region                => ENV['S3_REGION']
  }

  # For testing, upload files to local `tmp` folder.
  if Rails.env.test? || Rails.env.cucumber?
    config.storage = :file
    config.enable_processing = false
    config.root = "#{Rails.root}/tmp"
  else
    config.storage = :fog
  end

  config.cache_dir = "#{Rails.root}/tmp/uploads" # To let CarrierWave work on heroku
  config.fog_directory = ENV['S3_BUCKET_NAME']
  config.s3_access_policy = :public_read # Generate http:// urls. Defaults to :authenticated_read (https://)
  config.fog_host = "#{ENV['S3_ASSET_URL']}/#{ENV['S3_BUCKET_NAME']}"
end
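For context, the uploader I mount looks roughly like this (the class name and store_dir are illustrative placeholders; my real uploader has a bit more in it):
# app/uploaders/image_uploader.rb (illustrative sketch, names are placeholders)
class ImageUploader < CarrierWave::Uploader::Base
  # No `storage` call here, so the :fog / :file choice from the initializer above applies.

  # Files end up under uploads/<model>/<mounted_as>/<id>/ in the S3 bucket.
  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end
end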
After that, I committed my changes and pushed to Heroku:
git add .
git commit -m "added S3 configs with fog"
git push heroku master
When I visited my Heroku app, I realized something was wrong, and checking the logs showed the following errors:
...
2013-01-20T11:00:17+00:00 heroku[web.1]: Process exited with status 1
2013-01-20T11:00:17+00:00 heroku[web.1]: State changed from starting to crashed
2013-01-20T11:00:17+00:00 heroku[web.1]: State changed from crashed to starting
...
2013-01-20T11:00:52+00:00 app[web.1]: /app/config/initializers/carrierwave.rb:29:in `block in <top (required)>': undefined method `s3_access_policy=' for CarrierWave::Uploader::Base:Class (NoMethodError)
...
2013-01-20T11:00:53+00:00 heroku[web.1]: Process exited with status 1
2013-01-20T11:00:53+00:00 heroku[web.1]: State changed from starting to crashed
2013-01-20T11:04:52+00:00 heroku[router]: at=error code=H10 desc="App crashed" method=GET path=/ host=myapp.herokuapp.com fwd=xxx.xxx.xx.x dyno= queue= wait= connect= service= status=503 bytes=
2013-01-20T11:04:54+00:00 heroku[router]: at=error code=H10 desc="App crashed" method=GET path=/favicon.ico host=myapp.herokuapp.com fwd=xxx.xxx.xx.x dyno= queue= wait= connect= service= status=503 bytes=
2013-01-20T11:04:55+00:00 heroku[router]: at=error code=H10 desc="App crashed" method=GET path=/favicon.ico host=myapp.herokuapp.com fwd=xxx.xxx.xx.x dyno= queue= wait= connect= service= status=503 bytes=
I also tried to run heroku run rake db:migrate
and got an error:
rake aborted!
undefined method `s3_access_policy=' for CarrierWave::Uploader::Base:Class
Also, in my views, what URL should I put for the static assets?
Thank you for any wise advice
Aurelien
Upvotes: 2
Views: 3263
Reputation: 909
Here is how I solved the problem:
I commented out the two settings config.s3_access_policy and config.fog_host, and it worked. Here is my final initializer, with an extra conditional check to use fog or the Rails file system for uploads depending on the Rails environment. Thank you FrontierPsycho, you are right, the credentials are the minimum prerequisite.
# config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    # Configuration for Amazon S3 should be made available through an Environment variable.
    # For local installations, export the env variable through the shell OR
    # if using Passenger, set an Apache environment variable.
    #
    # In Heroku, follow http://devcenter.heroku.com/articles/config-vars
    #
    # $ heroku config:add S3_KEY=your_s3_access_key S3_SECRET=your_s3_secret S3_REGION=eu-west-1 S3_ASSET_URL=http://assets.example.com/ S3_BUCKET_NAME=s3_bucket/folder
    # Configuration for Amazon S3
    :provider              => 'AWS',
    :aws_access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
    :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY'],
    :region                => ENV['S3_REGION']
  }

  # For testing, upload files to local `tmp` folder.
  if Rails.env.test? || Rails.env.cucumber?
    config.storage = :file
    config.enable_processing = false
    config.root = "#{Rails.root}/tmp"
  elsif Rails.env.development?
    config.storage = :file
  else
    config.storage = :fog
  end

  config.cache_dir = "#{Rails.root}/tmp/uploads" # To let CarrierWave work on heroku
  config.fog_directory = ENV['S3_BUCKET_NAME']
  # config.s3_access_policy = :public_read # Generate http:// urls. Defaults to :authenticated_read (https://)
  # config.fog_host = "#{ENV['S3_ASSET_URL']}/#{ENV['S3_BUCKET_NAME']}"
end
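With that initializer in place, the uploader is mounted on the model as usual, something like this (the model and column names are just placeholders for illustration):
# app/models/user.rb (hypothetical model and column names)
class User < ActiveRecord::Base
  # Assumes a string `avatar` column on users.
  # Gives user.avatar and user.avatar.url, which point at S3 when storage is :fog.
  mount_uploader :avatar, AvatarUploader
end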
Upvotes: 2
Reputation: 743
I am not sure about this, but I seem to have followed the same guide as you, and it must be outdated. The CarrierWave uploader API seems to have changed: uploaded images are now public by default, which you can change via the config.fog_public configuration option.
Look here and here for more info.
I ended up with only:
...
  :provider              => 'AWS',
  :aws_access_key_id     => ENV['S3_KEY'],
  :aws_secret_access_key => ENV['S3_SECRET'],
  :region                => ENV['S3_REGION']
...
in my fog initializer. Nothing more was needed.
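For what it's worth, if you do want private (authenticated, expiring) URLs instead of the new public default, my understanding is that you toggle it like this (a sketch only, I did not need it myself):
# Optional, goes inside the CarrierWave.configure do |config| ... end block
config.fog_public = false                      # default is true (public URLs)
config.fog_authenticated_url_expiration = 600  # seconds the signed URL stays valid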
Upvotes: 1