hkaraoglu

Reputation: 1345

How can I archive a directory in Perl?

I'm trying to archive a directory with all of its subdirectories and files, but the code below doesn't work properly. It only archives the top-level files and folders, not the contents of the folders. How can I solve this?

use Archive::Tar;
use DateTime;

my @files = glob( $BACKUP_PATH.'website/*' );
my $tar = Archive::Tar->new();
$tar->add_files(@files);

my $date = DateTime->now(time_zone=>"local");
$tar->write($BACKUP_PATH.$websiteFolderName."/".$date->dmy('-')."-".$websiteFolderName.".tar");

Upvotes: 4

Views: 391

Answers (2)

Borodin

Reputation: 126722

I thought I'd write a bells-and-whistles version of this because I was so disappointed with the accepted solution. I want to show you what is possible and what I believe is proper.

I've made these changes

  • I'm using Time::Piece to get the date stamp for the file name. It's a core module, and it is all that is necessary to generate a %d-%m-%Y datestamp. DateTime is non-core, and a huge amount of code to load for such a simple purpose

    By the way I've used a ymd format instead of dmy. If you want it the way you wrote your own code then you can change it back, but ymd names are useful because they sort in date order

  • I've used catdir and catfile from File::Spec::Functions, the procedural interface to the core File::Spec library. It saves worrying about whether one variable has a slash at the end and another at the beginning, and makes for clearer code because it avoids all those nasty string concatenations

  • I've used make_path from File::Path to ensure that the destination directory actually exists

  • Since the paths to the files to be archived are stored in an array anyway, there is no reason not to use the create_archive class method to build and save your files all in one go

    The second parameter to create_archive is a compression method. I have set it to zero to request no compression and maximum speed, but you can enable gzip or bzip compression by passing COMPRESS_GZIP or COMPRESS_BZIP

    Although the module's documentation discourages it, you can also set this parameter to 1 through 9 to indicate a level of gzip compression. If you are archiving text files then I highly recommend that you use 1 or 2 here. Passing COMPRESS_GZIP results in a gzip compression level of 9, which is by far the slowest and can often result in a *larger* file than lower compression levels. A value of 1 will often give you a file one third of the size of the uncompressed archive, while 2 may give you another five percent reduction. After that there is usually very little gain. There is a short sketch of these calls just after this list

  • I've added a few log messages to give confidence and inform about the progress of the program. I've also written a little bytes subroutine to convert the size of a file to the appropriate power of 1KB to make it more readable
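
For reference, here is a minimal sketch of how that second parameter to create_archive can be passed. The file names here are placeholders, and @filelist is assumed to be built as in the full program below

use Archive::Tar;    # exports the COMPRESS_GZIP and COMPRESS_BZIP constants

# No compression at all - the fastest option, and what the program below uses
Archive::Tar->create_archive( 'backup.tar', 0, @filelist );

# gzip compression - COMPRESS_GZIP asks for level 9
Archive::Tar->create_archive( 'backup.tar.gz', COMPRESS_GZIP, @filelist );

# An explicit gzip level of 1 - discouraged by the documentation, but much faster
Archive::Tar->create_archive( 'backup.tar.gz', 1, @filelist );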

I'm using Windows paths because that's where I happen to be sat. Because of the use of File::Spec it should work fine on Linux as well once you have set the paths and file names at the top of the code

Whether or not you use this, I hope you find it useful

use strict;
use warnings;

use Archive::Tar;
use File::Find qw/ find /;
use Time::Piece;
use File::Spec::Functions qw/ catfile catdir /;
use File::Path qw/ make_path /;

STDOUT->autoflush;

my $BACKUP_PATH         = 'E:/Perl';
my $website_folder_name = 'website';
my $website_backup_name = 'website_backup';

print "Making list of files to archive\n";

my $source_dir = catdir($BACKUP_PATH, $website_folder_name);
my @filelist;

find( sub {
    return unless -f;
    push @filelist, $File::Find::name;
}, $source_dir);

printf "List complete - %d files to be archived\n", scalar @filelist;

my $tardir = catdir($BACKUP_PATH, $website_backup_name);
make_path($tardir, { verbose => 1 } );

my $date = localtime()->ymd;
my $tarfile = catfile($tardir, "$date-$website_backup_name.tar");
print qq{Writing archive to "$tarfile"\n};

my $success = Archive::Tar->create_archive($tarfile, 0, @filelist);
printf "Archive written successfully - %s\n", bytes(-s $tarfile) if $success;


sub bytes {
    my ($size) = @_;

    my @prefix = ( '', qw/ K M G T P E / );

    my $pow = 0;
    while ( $size >= 1024 ) {
        $size /= 1024;
        ++$pow;
    }

    sprintf "%.0f%sB", $size, $prefix[$pow];
}

output

Making list of files to archive
List complete - 6602 files to be archived
Writing archive to "E:\Perl\website_backup\2016-01-07-website_backup.tar"
Archive written successfully - 7MB

Upvotes: 3

zb226

Reputation: 10500

glob (see perldoc -f glob) does not recurse into subdirectories. You need to take care of that yourself or use a module like File::Find:

use Archive::Tar;
use DateTime;
use File::Find qw(find);

my @files;
find( { wanted => sub { push @files, $File::Find::name } }, $BACKUP_PATH.'website');
my $tar = Archive::Tar->new();
$tar->add_files( @files );

my $date = DateTime->now(time_zone=>"local");
$tar->write($BACKUP_PATH.$websiteFolderName."/".$date->dmy('-')."-".$websiteFolderName.".tar");
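
If you want the list to contain only plain files, you can filter inside the wanted callback (the other answer on this page does the same thing with return unless -f). A minimal sketch of that variation, with everything else unchanged:

use Archive::Tar;
use File::Find qw(find);

my @files;
# -f skips directories, so only plain files end up in the list
find( { wanted => sub { push @files, $File::Find::name if -f } }, $BACKUP_PATH.'website' );

my $tar = Archive::Tar->new();
$tar->add_files( @files );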

Upvotes: 2
