Reputation: 11240
I've been stuck on a little unix command line problem.
I have a website folder (4 GB) I need to grab a copy of, but just the .php, .html, .js and .css files (which total only a couple hundred KB).
I'm thinking ideally there is a way to zip or tar a whole folder while only grabbing certain file extensions and retaining the subfolder structure. Is this possible, and if so, how?
I did try zipping the whole thing and then going through and excluding certain files, but that seemed a bit excessive.
I'm kinda new to unix.
Any ideas would be greatly appreciated.
Upvotes: 86
Views: 82806
Reputation: 11495
Switch into the website folder, then run
zip -R foo.zip '*.php' '*.html' '*.js' '*.css'
You can also run this from outside the website folder:
zip -r foo.zip website_folder -i '*.php' '*.html' '*.js' '*.css'
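As a sanity check you can list what ended up in the archive with zip's own -sf (show files) flag, without extracting anything. A minimal sketch, with made-up file names:

```shell
# Build a tiny demo tree (names are illustrative)
mkdir -p demo_site/assets
echo '<?php phpinfo(); ?>' > demo_site/index.php
echo 'body { margin: 0; }' > demo_site/assets/style.css
printf 'binary junk'       > demo_site/assets/big.bin   # should be excluded

# Archive only the wanted extensions, keeping the folder structure
zip -r foo.zip demo_site -i '*.php' '*.css'

# List the archive contents without extracting
zip -sf foo.zip
```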
Upvotes: 139
Reputation: 7740
You can use find and egrep to generate the file list, then pipe that into zip, e.g.
find . | egrep "\.(html|css|js|php)$" | zip -@ test.zip
(-@ tells zip to read the file list from stdin)
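An equivalent pipeline that sidesteps the regex quoting is to let find match the extensions itself with -name. A sketch with made-up paths; note that zip -@ reads one name per line, so a filename containing a newline would break either variant:

```shell
mkdir -p site/js
echo 'console.log(1);' > site/js/app.js
echo 'hello'           > site/index.php
echo 'not wanted'      > site/readme.txt

# find prints matching paths; zip -@ reads them from stdin
find site -type f \( -name '*.php' -o -name '*.js' \) | zip -@ test.zip

# Show what was archived
zip -sf test.zip
```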
Upvotes: 43
Reputation: 41858
I liked Nick's answer, but, since this is a programming site, why not use Ant to do this? :)
Then you can put in a parameter so that different types of files can be zipped up.
http://ant.apache.org/manual/Tasks/zip.html
Upvotes: 4
Reputation: 342433
You may want to use find (GNU) to find all your .php, .html, etc. files, then tar them up:
find /path -type f \( -iname "*.php" -o -iname "*.css" -o -iname "*.js" -o -iname "*.ext" \) -exec tar -r --file=test.tar "{}" +
(the -exec … {} + form takes no trailing semicolon). After that you can compress it.
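A runnable sketch of that find-then-tar-then-compress sequence, with invented file names. tar -r can only append to an uncompressed archive, so the compression has to come last:

```shell
mkdir -p src/css
echo '<?php ?>'     > src/page.php
echo 'h1 { }'       > src/css/main.css
echo 'leave me out' > src/notes.txt

# Append every match to the tar archive in one go
find src -type f \( -iname '*.php' -o -iname '*.css' \) -exec tar -rf test.tar {} +

# Compress afterwards
gzip test.tar            # yields test.tar.gz
tar -tzf test.tar.gz     # lists src/page.php and src/css/main.css
```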
Upvotes: 4
Reputation: 5927
This is how I managed to do it, but I also like ghostdog74's version.
tar -czvf archive.tgz `find test/ | egrep ".*\.html|.*\.php"`
You can add extra extensions by adding them to the regex.
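One caveat: the backtick substitution splits the file list on whitespace, so a filename with a space in it would break. If GNU tar is available, a null-delimited variant is safer; a sketch with made-up names:

```shell
mkdir -p test
echo '<h1>hi</h1>' > 'test/my page.html'
echo '<?php ?>'    > test/index.php
echo 'skip'        > test/notes.txt

# -print0 / --null pass names NUL-delimited, so spaces are safe (GNU tar)
find test -type f \( -name '*.html' -o -name '*.php' \) -print0 \
  | tar -czvf archive.tgz --null -T -

tar -tzf archive.tgz
```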
Upvotes: 7
Reputation: 28167
You could write a shell script that copies files matching a pattern/expression into a new folder, zips the contents, and then deletes the folder. As for the actual syntax of it, I'll leave that to you :D.
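That copy-zip-delete sequence could be sketched like this (directory names are made up, and GNU cp's --parents flag is assumed for preserving the subfolder structure):

```shell
#!/bin/sh
# Demo tree (illustrative names)
mkdir -p website_folder/sub
echo '<?php ?>' > website_folder/index.php
echo 'x'        > website_folder/sub/app.js
echo 'junk'     > website_folder/sub/big.dat

out=$PWD/code_only.zip
staging=$(mktemp -d)

# 1. Copy only matching files into the staging folder, keeping paths
(cd website_folder && \
  find . -type f \( -name '*.php' -o -name '*.js' \) \
    -exec cp --parents {} "$staging" \;)

# 2. Zip the staging folder's contents
(cd "$staging" && zip -r "$out" .)

# 3. Delete the staging folder
rm -rf "$staging"
```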
Upvotes: 1