Reputation: 2751
I am processing some video with ffmpeg and then firing the result up to S3 with the aws_s3 gem. I use the following code:
S3Object.store("testme.mp4", open(file), 'blah', :access => :public_read)
Everything works great, but with files of 1 GB and over I receive the following error:
"Timeout::Error: execution expired".
This only happens after ffmpeg has processed the file, however; if I send the file on its own, without processing, everything is fine.
Has anyone come across a similar issue?
Thanks,
SLothistype
Upvotes: 1
Views: 1864
Reputation: 1444
The modern way to set :http_read_timeout is while initializing Aws::S3::Client:
# https://docs.aws.amazon.com/sdk-for-ruby/v3/api/Aws/S3/Client.html#initialize-instance_method
s3_client = Aws::S3::Client.new(
  region: region_name,
  credentials: credentials,
  http_read_timeout: 300, # seconds; the SDK default is 60
)
# Instead of open(file) in the OP's code
File.open('/source/file/path', 'rb') do |file|
  # https://docs.aws.amazon.com/sdk-for-ruby/v3/api/Aws/S3/Client.html#put_object-instance_method
  s3_client.put_object(bucket: 'bucket-name', key: 'testme.mp4', body: file)
end
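For files in the 1 GB range it can also help to let the SDK run a managed multipart upload rather than one long put_object request; in aws-sdk-s3 v3, Aws::S3::Object#upload_file does this automatically above a configurable threshold. A minimal sketch, reusing the s3_client above (bucket and key names are placeholders):
require 'aws-sdk-s3'

# upload_file switches to a multipart upload for large files, so no single
# HTTP request has to stay open for the whole 1 GB body.
object = Aws::S3::Resource.new(client: s3_client)
                          .bucket('bucket-name')
                          .object('testme.mp4')

# multipart_threshold is in bytes; files larger than this are split into parts.
object.upload_file('/source/file/path', multipart_threshold: 100 * 1024 * 1024)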
Upvotes: 0
Reputation: 5106
I have run into this problem and unfortunately had to monkey-patch the AWS::S3::Connection#create_connection method so I could increase the read_timeout.
If you reimplement the method yourself, you would set:
http.read_timeout = 300 # or whatever higher value you need
I originally found this via Pivotal Labs, Inc. They are well respected and were essentially saying "this is not a great solution, but the aws_s3 gem doesn't expose anything better."
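For reference, a minimal sketch of what such a patch might look like. It assumes create_connection is an instance method that builds and returns the Net::HTTP object, so check the method body in the gem version you are running before relying on it:
# Monkey patch: wrap aws_s3's connection factory so the Net::HTTP object
# it returns gets a longer read timeout (300 is the value suggested above).
module AWS
  module S3
    class Connection
      alias_method :create_connection_without_timeout, :create_connection

      def create_connection
        http = create_connection_without_timeout
        http.read_timeout = 300
        http
      end
    end
  end
end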
Upvotes: 5