Reputation: 464
So I have seen similar questions that answer part of my question, but not at the level of detail I need. Here is the overall gist of what I'm trying to do.
I am migrating a bunch of websites to a new server. Currently, they are all located in /srv/www/. Each site has its own folder in that directory, named after the domain, such as /srv/www/domain1.com or /srv/www/domain2.com. In each domain.com directory there is a public web directory named html.
What I want to do is cd into /srv/www and then run a command or bash script that recursively goes into each domain.com/html directory, zips the contents of that directory, and outputs the archive as /srv/web_backup/domain.com.zip. Again, I only want the contents: when I unzip the file, I don't want it to extract into html/{contents}.
In this case, domain.com is an example; each folder will have its own name.
If I were doing this manually, I would cd into /srv/www/domain.com/html and run zip -r /srv/web_backup/domain.com.zip . .htaccess, which picks up all files, directories, and the .htaccess file.
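For a single site (using domain1.com from the examples above), that manual sequence is:

cd /srv/www/domain1.com/html
zip -r /srv/web_backup/domain1.com.zip . .htaccess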
So to summarize: I want to individually zip the contents, not the directory itself, of every /srv/www/{domain.com}/html, name each archive {domain.com}.zip (after the parent directory of the html folder), and save those zip files to /srv/web_backup/.
I appreciate any help, as I'm not too proud to admit this is above my Linux knowledge or ability.
Upvotes: 0
Views: 1005
Reputation: 311615
I think you've mostly described the steps. You just need to put them into a script. Something like:
#!/bin/bash

cd /srv/www
for domain in *; do
    (
        cd "$domain/html" &&
        zip -r "/srv/web_backup/$domain.zip" . .htaccess
    )
done
Breaking that down a bit:
# change into the /srv/www directory
cd /srv/www

# for each domain in this directory (we're largely assuming here that
# this directory contains only directories representing domains you
# want to back up)
for domain in *; do
    # a (...) expression runs the enclosed commands in a subshell, so
    # that when it exits we're back in /srv/www
    (
        # change into the html directory and, if that was successful,
        # create or update the zip file; the quotes guard against
        # directory names containing spaces
        cd "$domain/html" &&
        zip -r "/srv/web_backup/$domain.zip" . .htaccess
    )
done
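That script assumes every entry under /srv/www contains an html subdirectory. If that might not hold, a minimal hardened sketch of the same idea looks like this: the glob only matches directories that actually contain html, and the backup directory is created up front.

#!/bin/bash

mkdir -p /srv/web_backup
for dir in /srv/www/*/html; do
    [ -d "$dir" ] || continue  # skip the literal pattern if nothing matched
    # derive domain.com from /srv/www/domain.com/html
    domain=$(basename "$(dirname "$dir")")
    (cd "$dir" && zip -r "/srv/web_backup/$domain.zip" . .htaccess)
done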
This is only one way of tackling this particular problem. You could also write something like:
#!/bin/sh

find /srv/www/* -maxdepth 0 -type d -print0 |
xargs -0 -I DOMAIN sh -c 'cd "DOMAIN/html" && zip -r "/srv/web_backup/$(basename DOMAIN).zip" . .htaccess'
That does roughly the same thing, but makes clever use of find and xargs.
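Whichever version you run, you can spot-check one of the archives to confirm the contents sit at the top level rather than under html/ (domain1.com is just the example name from the question):

unzip -l /srv/web_backup/domain1.com.zip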
Upvotes: 1