Reputation: 585
I know there is a way to upload to S3 directly from the web browser using POST, without the files going through your backend server. But is there a way to do it from a URL instead of from the web browser?
For example, upload the file that resides at http://example.com/dude.jpg directly to S3 using POST. I mean, I don't want to download the asset to my server and then upload it to S3; I just want to make a POST request to S3 and have it upload the file automatically.
Upvotes: 47
Views: 44378
Reputation: 2213
I use this Python AWS Lambda function, quickly written five years ago:
import boto3
import urllib3  # botocore.vendored.requests was removed from botocore; import urllib3 directly

def lambda_handler(event, context):
    url = event['url']
    bucket = 'your-bucket'
    key = event['filename']
    s3 = boto3.client('s3')
    http = urllib3.PoolManager()
    # preload_content=False returns a file-like response that upload_fileobj can stream
    s3.upload_fileobj(http.request('GET', url, preload_content=False), bucket, key)
Upvotes: 1
Reputation: 8122
If you are able to, you can use Cloudinary as an alternative to S3. It supports remote upload via URL and more.
https://cloudinary.com/documentation/image_upload_api_reference#upload_examples
Upvotes: 0
Reputation: 2299
You can use rclone to achieve this easily: https://rclone.org/commands/rclone_copyurl/
Create a new access key on AWS for rclone and use rclone config like this: https://rclone.org/s3/
Then, you can easily interact with your S3 buckets using rclone.
To upload from URL:
rclone copyurl -Pva {URL} RCLONE_CONFIG_NAME:/{BUCKET_NAME}/{FOLDER}/
It is quite handy for me as I am archiving my old files from Dropbox Business to S3 Glacier Deep Archive to save on Dropbox costs.
I can easily create a file transfer from Dropbox (100GB per file limit), copy the download link and upload directly to S3 using rclone.
It is copying at 10-12 MiB/s on a small DigitalOcean droplet.
Upvotes: 1
Reputation: 4801
I thought I should share my code for achieving something similar. I was working on the backend, but you could possibly do something similar on the frontend; be mindful, though, that your AWS credentials would likely be exposed.
For my purposes, I wanted to download a file from an external URL and then get back the S3 URL of the uploaded file.
I also used axios to fetch the file in an uploadable format, and file-type to detect the proper MIME type of the file, though the latter is not a requirement.
Below is the snippet of my code:
const AWS = require('aws-sdk');
const axios = require('axios');
const FileType = require('file-type');

const BUCKET_NAME = 'your-bucket';
const s3 = new AWS.S3();

async function uploadAttachmentToS3(type, buffer) {
    var params = {
        // file name: you can get it from the URL or in any other way; you could
        // then pass it as a parameter to the function, for example, if necessary
        Key: 'yourfolder/directory/filename',
        Body: buffer,
        Bucket: BUCKET_NAME,
        ContentType: type,
        ACL: 'public-read' // becomes a public URL
    };
    // notice the use of the upload function, not the putObject function
    return s3.upload(params).promise().then((response) => {
        return response.Location;
    }, (err) => {
        return { type: 'error', err: err };
    });
}

async function downloadAttachment(url) {
    return axios.get(url, {
        responseType: 'arraybuffer'
    })
        .then(response => {
            // response.data is already binary here; no encoding argument is needed
            const buffer = Buffer.from(response.data);
            return (async () => {
                let type = (await FileType.fromBuffer(buffer)).mime;
                return uploadAttachmentToS3(type, buffer);
            })();
        })
        .catch(err => {
            return { type: 'error', err: err };
        });
}

let myS3Url = await downloadAttachment(url);
I hope it helps people who still struggle with similar issues. Good luck!
Upvotes: 6
Reputation: 22417
It sounds like you want S3 itself to download the file from a remote server, where you only pass the URL of the resource to S3.
This is not currently supported by S3.
It requires an API client to actually transfer the content of the object to S3.
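Since S3 will not fetch a URL for you, the usual workaround is a thin relay (a Lambda, a droplet, or your backend) that streams the remote file into S3 without ever writing it to disk. Below is a minimal sketch of that idea; `stream_url_to_s3` and its injectable `s3`/`http` parameters are my own names for illustration, not an AWS API:

```python
def stream_url_to_s3(url, bucket, key, s3=None, http=None):
    """Stream the file at `url` into s3://bucket/key without a local copy.

    `s3` and `http` are injectable for testing; by default they are a
    boto3 S3 client and a urllib3 pool (imported lazily).
    """
    if s3 is None:
        import boto3
        s3 = boto3.client('s3')
    if http is None:
        import urllib3
        http = urllib3.PoolManager()
    # preload_content=False keeps the body unread, so the response acts as a
    # file-like object that upload_fileobj() consumes in chunks.
    resp = http.request('GET', url, preload_content=False)
    try:
        s3.upload_fileobj(resp, bucket, key)
    finally:
        resp.release_conn()
```

The bytes still pass through whatever machine runs this code, but only a small buffer is held in memory at any time, so even large files can be relayed cheaply.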
Upvotes: 44
Reputation: 3680
I found this article with some details. You will probably have to modify your bucket's security settings in some fashion to allow this type of interaction.
http://aws.amazon.com/articles/1434
There will be some security issues on the client as well, since you never want your keys publicly accessible.
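The browser-based POST flow that article describes works by having your server sign a policy document, so the secret key never reaches the client. As a rough sketch, a minimal policy might look like the following (the bucket name, key prefix, and size limit are placeholders):

```json
{
  "expiration": "2030-01-01T00:00:00Z",
  "conditions": [
    {"bucket": "your-bucket"},
    ["starts-with", "$key", "uploads/"],
    {"acl": "public-read"},
    ["content-length-range", 0, 10485760]
  ]
}
```

The server base64-encodes and signs this policy, and the browser includes the encoded policy and signature as form fields in its POST to the bucket, alongside the file itself.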
Upvotes: -1