Arman

Reputation: 4636

Parallel tar with split for large folders

I have a really huge folder that I would like to gzip and split into pieces for archiving:

#!/bin/bash
dir=$1
name=$2
size=32000m
tar -cz "${dir}" | split -a 5 -d -b "${size}" - "${name}"

Is there a way to speed this up with GNU Parallel? Thanks.

Upvotes: 3

Views: 10441

Answers (1)

konsolebox

Reputation: 75548

It seems the best tool for parallel gzip compression is pigz. See the comparisons.

With it you can have a command like this:

tar -c "${dir}" | pigz -c | split -a 5 -d -b "${size}" - "${name}"

With the -p option you can also specify the number of threads to use (the default is the number of online processors, or 8 if that is unknown). See pigz --help or man pigz for more information.
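For example, to pin pigz to a fixed number of threads in the pipeline above (the thread count of 8 here is only an illustration):

# Same pipeline, but with pigz limited to 8 compression threads.
tar -c "${dir}" | pigz -c -p 8 | split -a 5 -d -b "${size}" - "${name}"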

UPDATE

Using GNU Parallel you could do something like this:

contents=("$dir"/*)
outdir=/somewhere
parallel tar -cvpzf "${outdir}/{}.tar.gz" "$dir/{}" ::: "${contents[@]##*/}"
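By default GNU Parallel runs one job per CPU core; since each job here is a full tar+gzip pipeline, you may want to cap the number of simultaneous jobs with -j so the disks are not saturated. A sketch, where the value 4 is only an example:

# Limit Parallel to 4 concurrent tar jobs over the top-level entries.
parallel -j 4 tar -cvpzf "${outdir}/{}.tar.gz" "$dir/{}" ::: "${contents[@]##*/}"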

Upvotes: 5
