Reputation: 1006
The link https://www.digitalocean.com/community/questions/how-to-upload-an-object-to-digital-ocean-spaces-using-python-boto3-library only explains how to upload individual files to Spaces.
I want to upload an entire folder to a Space.
import boto3

session = boto3.session.Session()
client = session.client('s3',
                        region_name='nyc3',
                        endpoint_url='https://nyc3.digitaloceanspaces.com',
                        aws_access_key_id='ACCESS_KEY',
                        aws_secret_access_key='SECRET_KEY')

client.upload_file('/path/to/file.ext',  # Path to local file
                   'my-space',           # Name of Space
                   'file.ext')           # Name for remote file
This only uploads a single file. How can I upload a whole folder or directory using this approach?
Upvotes: 3
Views: 2861
Reputation: 1
Folders can be pushed to Spaces using s3cmd; we just need to configure the CLI (Linux).
1) The first step is to install s3cmd:
   sudo apt update
   sudo apt install s3cmd
2) The second step is to run the configuration wizard:
   s3cmd --configure
   Access Key: EXAMPLE7UQOTHDTF3GK4
   Secret Key: exampleb8e1ec97b97bff326955375c5
8) When using the secure HTTPS protocol, all communication with the S3 servers is protected from third-party eavesdropping.
   Use HTTPS protocol [Yes]: Yes
9) If your network requires one, enter its IP address or domain name without the protocol, for example 203.0.113.1 or proxy.example.com. Try setting it here if you can't connect to S3 directly. (Optional)
   HTTP Proxy server name:
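With the CLI configured, a recursive put uploads the whole folder in one command. A minimal sketch, assuming a Space named my-space and a local folder ./my-folder:

# Upload the folder and everything under it to the Space
s3cmd put --recursive ./my-folder s3://my-space/my-folder/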
Upvotes: 0
Reputation: 833
s3cmd will do the job.
It supports uploading multiple files, uploading in chunks, and even syncing a local folder with a folder in a Space.
https://docs.digitalocean.com/products/spaces/reference/s3cmd/
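For example, a sync mirrors a local folder into a Space, transferring only files that have changed; a minimal sketch, assuming a Space named my-space and a local folder ./my-folder:

# Sync the local folder with the corresponding folder in the Space
s3cmd sync ./my-folder/ s3://my-space/my-folder/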
Upvotes: -2
Reputation: 238877
You do it the same way as with S3: iterate over the files in the folder and upload each one with the same upload_file call as you go.
Only the AWS CLI has a high-level command for uploading whole folders; boto3 can only upload individual files.
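A minimal sketch of that loop, reusing the client settings from the question and assuming a hypothetical local folder /path/to/folder to be uploaded into the my-space Space:

import os

import boto3

session = boto3.session.Session()
client = session.client('s3',
                        region_name='nyc3',
                        endpoint_url='https://nyc3.digitaloceanspaces.com',
                        aws_access_key_id='ACCESS_KEY',
                        aws_secret_access_key='SECRET_KEY')

local_dir = '/path/to/folder'  # hypothetical local folder to upload
space_name = 'my-space'        # name of the Space, as in the question

for root, dirs, files in os.walk(local_dir):
    for filename in files:
        local_path = os.path.join(root, filename)
        # Remote key mirrors the folder structure relative to local_dir
        remote_key = os.path.relpath(local_path, local_dir).replace(os.sep, '/')
        client.upload_file(local_path, space_name, remote_key)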
Upvotes: 2