Ansgar Schmidt

Reputation: 169

How can I backup volumes in the IBM Docker Cloud in Bluemix?

I know how to save docker images when I can access them locally, but how can I backup a volume container in the IBM Bluemix platform?

Thanks Ansi

Upvotes: 1

Views: 582

Answers (2)

MouIdri

Reputation: 1390

Two possibilities I found so far:

A - First possibility: using a nice tool called CloudBerry Backup for Linux: http://www.cloudberrylab.com/backupcmd_nix.aspx#section-synchronise-account

1- Install:

dpkg -i ubuntu14_CloudBerryLab_CloudBerryBackup_v1.10.0.112_20161110193906.deb

2- Activate the license:

cbb activateLicense -e "[email protected]" -t

3- Add the object storage account:

cbb addAccount -d testCOS -st S3Compatible -ac wfkNMxTXjE1wRlCDYW9A -sk UMsSQfQGsgbhnKIieYgNAwOh218FUlvCfnpFlV5k -ep http://myendpoint -c demovaultbucket -bp testbkpclientcloudfoundry

For example, suppose you want to:

back up all the files from the "/home/NAME/Documents/" directory;
exclude the "/home/NAME/Documents/books/" folder;
use compression;
run every workday at 23:00;
receive a notification on completion:

4- Add a plan for your backup:

cbb addBackupPlan -n "Backup my docs" -a "testCOS" -f "/home/NAME/Documents/" -ef "/home/NAME/Documents/books/" -c yes -every week -at "23:00" -weekday "mo, tu, we, th, fr" -notification on

5- Add another plan for your backup:

cbb addBackupPlan -n "Backup my docs 2" -a "testCOS" -f "/root" -c yes -every week -at "13:00" -weekday "mo, tu, we, th, fr"

root@bluemix:~# cbb plan  -l
CloudBerry Backup Command Line Interface started
Trial expires in 15 day(s)
Backup my docs 2 : Stopped
Backup my docs 3 : Stopped
Backup my docs   : Stopped
Backup my docs   : Stopped

6- Run it:

root@bluemix:~# cbb plan -free -r "Backup my docs 3"
CloudBerry Backup Command Line Interface started
Trial expires in 15 day(s)
Success

B - Second possibility: using Docker registry storage backed by Bluemix S3-compatible object storage. You will need to create your object storage account and some buckets in Bluemix (here: http://www.softlayer.com/object-storage):

You will receive a key and password for every object storage account; in this example we create 2 buckets. The choice of endpoint is up to you; in my case, I used the least expensive one, in Dallas.

key : XXXXXXXXXXXXXX PASS: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX

My 2 buckets are: demodockerbucket (used for testing) and demodockerbucket_fixed (to store my containers once the config suits my needs).

docker run -d -p 5000:5000 -e "REGISTRY_STORAGE=s3" -e "REGISTRY_STORAGE_S3_REGION=generic" -e "REGISTRY_STORAGE_S3_REGIONENDPOINT=https://s3-api.dal-us-geo.objectstorage.softlayer.net/" -e "REGISTRY_STORAGE_S3_BUCKET=demodockerbucket" -e "REGISTRY_STORAGE_S3_ACCESSKEY=XXXXXXXXXXXXXX" -e "REGISTRY_STORAGE_S3_SECRETKEY=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX" -e "REGISTRY_STORAGE_S3_SECURE=true" -e "REGISTRY_STORAGE_S3_ENCRYPT=false" registry
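Once the registry container is up, you can sanity-check that it is serving correctly with the standard Docker Registry v2 HTTP API (these endpoints come from the Registry API spec, not from the original setup):

```shell
# Check the registry responds at all (v2 API base endpoint, prints the HTTP status)
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:5000/v2/

# List the repositories the S3-backed registry currently holds;
# returns JSON of the shape {"repositories":[...]}
curl -s http://localhost:5000/v2/_catalog
```

After pushing images (see below), the `_catalog` response should list them, which confirms the layers really landed in the object storage bucket.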

Then play with your containers: push them to your object storage or pull them back:

docker pull ubuntu
docker pull sameersbn/openfire:latest
docker pull debian
docker tag ubuntu localhost:5000/ubuntu
docker tag sameersbn/openfire localhost:5000/openfire
docker tag debian localhost:5000/debian
docker push localhost:5000/openfire
docker push localhost:5000/ubuntu
docker push localhost:5000/debian
docker pull localhost:5000/openfire
docker pull localhost:5000/ubuntu
docker pull localhost:5000/debian
docker run -d -p 5000:5000 -e "REGISTRY_STORAGE=s3" -e "REGISTRY_STORAGE_S3_REGION=generic" -e "REGISTRY_STORAGE_S3_REGIONENDPOINT=https://s3-api.dal-us-geo.objectstorage.softlayer.net/" -e "REGISTRY_STORAGE_S3_BUCKET=demodockerbucket_fixed" -e "REGISTRY_STORAGE_S3_ACCESSKEY=XXXXXXXXXXXXXX" -e "REGISTRY_STORAGE_S3_SECRETKEY=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX" -e "REGISTRY_STORAGE_S3_SECURE=true" -e "REGISTRY_STORAGE_S3_ENCRYPT=false" registry
docker run --name='openfire' -i -t --rm -p 192.168.56.108:9090:9090 -p 192.168.56.108:5222:5222 -p 192.168.56.108:7777:7777 -p 192.168.56.108:5275:5275 localhost:5000/openfire

Upvotes: 1

v.bontempi

Reputation: 1562

The simplest way to back up a (remote) container's volume is to mount the volume in another container and tar it: once the tar is complete you can download it using scp/sftp/ftp/http or whatever service you prefer for connecting to the container (according to the services available on it).

To mount the volume in another container you can use the --volumes-from flag to create a new container mounting this volume:

docker run --volumes-from [source container] -v /volume_backup ubuntu tar cvf /volume_backup/backup.tar /path_to_backup

This command launches a new container and mounts the volume from the [source container] container at the same path, /path_to_backup. A new volume is also created and mounted at /volume_backup.

Finally, tar is launched to archive the content of the /path_to_backup volume into a backup.tar file inside the /volume_backup directory.

When the command finishes, even if the container has stopped, the backup is contained in the other volume: you can mount this volume in another container to download it, push/upload it, or do whatever else you want with it.

This backup can also be restored simply by extracting the tar into the /path_to_backup path of the first container.
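The restore can be sketched as follows; this is an assumption-laden variant of the backup command above, where the helper container is given a placeholder name (backup_helper) so its /volume_backup volume survives and can be reused. Note that tar strips the leading "/" when creating the archive, so extracting from / puts files back under /path_to_backup:

```shell
# Backup, this time naming the helper container so its /volume_backup
# volume can be mounted again later (backup_helper is a placeholder name):
docker run --name backup_helper --volumes-from [source container] \
  -v /volume_backup ubuntu tar cvf /volume_backup/backup.tar /path_to_backup

# Restore: mount both the source container's volumes and the helper's
# /volume_backup, then extract the archive back to its original path.
docker run --rm --volumes-from [source container] \
  --volumes-from backup_helper ubuntu tar xvf /volume_backup/backup.tar -C /
```

To pull the archive to your local machine you could also use `docker cp backup_helper:/volume_backup/backup.tar ./backup.tar`.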

Otherwise you could use this ready-made container for backups: https://github.com/docker-infra/docker-backup

Here you can find the Docker docs on managing volumes; the only difference is that you have to think about a way to move/copy the backup to your local environment or wherever you wish to keep your volume backups:

http://docs.docker.com/v1.8/userguide/dockervolumes/

Upvotes: 4
