Reputation: 1043
I'm using sitemap_generator to create a sitemap. I have a rake task to create the sitemap and upload it to S3.
sitemap.rb
SitemapGenerator::Sitemap.default_host = "https://www.ezpoisk.com"
SitemapGenerator::Sitemap.create_index = true
SitemapGenerator::Sitemap.public_path = 'tmp/'
SitemapGenerator::Sitemap.sitemaps_path = 'sitemaps/'

SitemapGenerator::Sitemap.create do
  # generating links ...
end
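The actual links are generated inside that block with the gem's add calls; purely as a hypothetical illustration (these paths and the Listing model are made up, not my real code), it looks something like:

SitemapGenerator::Sitemap.create do
  add '/about',    changefreq: 'monthly', priority: 0.5
  add '/listings', changefreq: 'daily'

  # named route helpers are available inside the block
  Listing.find_each do |listing|
    add listing_path(listing), lastmod: listing.updated_at
  end
end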
rake task
require "aws"
namespace :sitemap do
desc "Upload the sitemap files to S3"
task upload_to_s3: :environment do
puts "Starting sitemap upload to S3..."
s3 = AWS::S3.new(access_key_id: ENV["AWS_ACCESS_KEY_ID"],
secret_access_key: ENV["AWS_SECRET_ACCESS_KEY"])
bucket = s3.buckets[ENV["S3_BUCKET_NAME"]]
Dir.entries(File.join(Rails.root, "tmp", "sitemaps")).each do |file_name|
next if ['.', '..', '.DS_Store'].include? file_name
path = "sitemaps/#{file_name}"
file = File.join(Rails.root, "tmp", "sitemaps", file_name)
begin
object = bucket.objects[path]
object.write(file: file)
rescue Exception => e
raise e
end
puts "Saved #{file_name} to S3"
end
end
desc 'Create the sitemap, then upload it to S3 and ping the search engines'
task create_upload_and_ping: :environment do
Rake::Task["sitemap:create"].invoke
Rake::Task["sitemap:upload_to_s3"].invoke
url = "https://www.ezpoisk.com/sitemaps/sitemap.xml.gz"
SitemapGenerator::Sitemap.ping_search_engines(url)
end
end
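For context, the daily Sidekiq job mentioned further down just invokes this rake task; a minimal sketch of how that might look (the worker name is hypothetical, and the real job may differ):

require "rake"

class SitemapRefreshWorker
  include Sidekiq::Worker

  def perform
    # load the app's (and gems') rake tasks once per worker process
    Rails.application.load_tasks if Rake::Task.tasks.empty?

    # re-enable so the job can run more than once in the same long-lived process
    %w[sitemap:create sitemap:upload_to_s3 sitemap:create_upload_and_ping].each do |name|
      Rake::Task[name].reenable
    end

    Rake::Task["sitemap:create_upload_and_ping"].invoke
  end
end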
and I want to be able to serve it from S3 via my site, so in routes:
get "sitemaps/sitemap(:id).:format.:compression" => "sitemap#show"
and in sitemaps_controller:
require "open-uri"  # Kernel#open needs this to fetch URLs

def show
  data = open("https://s3.amazonaws.com/#{ENV['S3_BUCKET_NAME']}/sitemaps/sitemap#{params[:id]}.xml.gz")
  send_data data.read, type: data.content_type
end
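Assuming sitemap_generator's default file names (sitemap.xml.gz for the index, sitemap1.xml.gz, sitemap2.xml.gz, ... for the parts), that route should resolve roughly like this:

# GET /sitemaps/sitemap.xml.gz  -> params[:id] is nil, the controller fetches .../sitemaps/sitemap.xml.gz (the index)
# GET /sitemaps/sitemap1.xml.gz -> params[:id] is "1", the controller fetches .../sitemaps/sitemap1.xml.gz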
Now. The problem.
When I run the rake task and try to access the file via the link, I get 403 Forbidden. I then go to the S3 console and manually do "Make public" on the "sitemaps" folder. Now when I try to access the file, it downloads properly... The problem is that when I run the task again (I have a Sidekiq job that does it once a day), I get 403 again... My assumption is that my write operation resets the permissions.
The bucket itself has the "allow list to everyone" permission.
I tried
bucket = s3.buckets[ENV["S3_BUCKET_NAME"]]
bucket.acl = :public_read
in the rake task, but it doesn't seem to take effect. I'm missing something; there has to be either a way to set a flag on write to make it public, or maybe I'm not initializing it properly.
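As far as I can tell, the bucket ACL only governs the bucket itself; each object carries its own ACL and newly written objects default to private, which would explain why every upload comes back as 403. As a temporary workaround the ACL of a single object can be reset from the console; a sketch with the v1 SDK (same bucket and key as above, and assuming acl= accepts the canned ACL symbol the same way the write call does):

s3 = AWS::S3.new(access_key_id: ENV["AWS_ACCESS_KEY_ID"],
                 secret_access_key: ENV["AWS_SECRET_ACCESS_KEY"])
object = s3.buckets[ENV["S3_BUCKET_NAME"]].objects["sitemaps/sitemap.xml.gz"]
object.acl = :public_read   # flips just this one object back to public-read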
Upvotes: 0
Views: 793
Reputation: 1043
OK, it's pretty simple/obvious (as usual): in the rake task it should be
object.write(file: file, acl: :public_read)
courtesy of https://www.codefellows.org/blog/tutorial-how-to-upload-files-using-the-aws-sdk-gem
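So the upload loop in the rake task only needs that one extra option on the write call; roughly (same loop as in the question, with the begin/rescue that only re-raised left out):

Dir.entries(File.join(Rails.root, "tmp", "sitemaps")).each do |file_name|
  next if ['.', '..', '.DS_Store'].include? file_name

  path = "sitemaps/#{file_name}"
  file = File.join(Rails.root, "tmp", "sitemaps", file_name)

  # :public_read marks the object publicly readable on every upload,
  # so the manual "Make public" step in the console is no longer needed
  bucket.objects[path].write(file: file, acl: :public_read)
  puts "Saved #{file_name} to S3"
end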
Upvotes: 0