Reputation: 938
I have a directory containing lots of files and subdirectories that I want to compress and export from HDFS to the local filesystem.
I came across this question - Hadoop: compress file in HDFS? - but it seems to be relevant only to files, and using hadoop-streaming with the GzipCodec gave me no success with directories.
What is the most efficient way to compress an HDFS folder into a single gzip file?
Thanks in advance.
Upvotes: 3
Views: 8282
Reputation: 938
As a quick-and-dirty solution, for those of you who don't want to use hadoop-streaming or any MapReduce job for this, I mounted HDFS with FUSE and then performed the operations on it as on a traditional filesystem.
Note that you probably don't want to use this as a permanent solution, only for a quick win :)
Further reading:
* https://hadoop.apache.org/docs/r1.2.1/streaming.html
* http://www.javased.com/index.php?api=org.apache.hadoop.io.compress.GzipCodec
Upvotes: -1
Reputation: 112239
You will need a library, or will have to roll your own code, to make a tar stream out of the files in the directory structure. You can then use zlib to compress the tar stream, giving you a standard .tar.gz file.
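As an illustration of that approach, here is a minimal Python sketch using the standard tarfile module (which uses zlib under the hood for the gzip layer). The directory path is a local one for simplicity; with HDFS you would instead feed file streams from an HDFS client into `tarfile.addfile()`:

```python
import os
import tarfile

def tar_gz_directory(data_dir, out_path):
    # "w|gz" writes the tar as a gzip-compressed stream, so the
    # archive is compressed on the fly and never has to exist
    # uncompressed on disk.
    with tarfile.open(out_path, "w|gz") as tar:
        # Recursively adds every file and subdirectory under data_dir.
        tar.add(data_dir, arcname=os.path.basename(data_dir))
```

The stream mode ("w|" rather than "w:") matters if the input is large, since it never seeks backwards in the output file.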
The two tidbits I can offer here, in case you want to merge the results of multiple such tasks, are: 1) you can concatenate gzip streams to make a valid gzip stream, and 2) you can concatenate tar streams to make a valid tar stream if you remove the final 1024 zero bytes from the non-final tar streams.
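A minimal Python sketch of both tricks (pure byte manipulation, no Hadoop involved). One caveat on point 2: many archivers pad the end-of-archive marker out to their blocking factor, so the trailing zero region can be larger than 1024 bytes; trimming every trailing zero block is the robust version of the same idea:

```python
import gzip
import io
import tarfile

BLOCK = 512  # tar operates on 512-byte blocks

def concat_gzip(blobs):
    # Complete gzip streams are self-delimiting, so plain byte
    # concatenation yields one valid multi-member gzip stream.
    return b"".join(blobs)

def strip_end_of_archive(tar_bytes):
    # A tar stream ends with two 512-byte zero blocks, possibly
    # followed by more zero padding up to the blocking factor.
    # Trim every trailing zero block so another tar can follow.
    end = len(tar_bytes)
    while end >= BLOCK and tar_bytes[end - BLOCK:end] == b"\0" * BLOCK:
        end -= BLOCK
    return tar_bytes[:end]

def concat_tar(first, second):
    # Only the final stream keeps its end-of-archive marker.
    return strip_end_of_archive(first) + second
```

Applying concat_gzip to per-task .tar.gz outputs only gives a valid combined archive if each non-final tar had its end-of-archive marker stripped before compression.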
Upvotes: 1