NAME

pbzip2 - parallel bzip2 file compressor, v0.9.5

SYNOPSIS

pbzip2 [ -123456789 ] [ -b#cdfkp#rtvV ] [ filenames ... ]

DESCRIPTION

pbzip2 is a parallel implementation of the bzip2 block-sorting file compressor that uses pthreads and achieves near-linear speedup on SMP machines. The output of this version is fully compatible with bzip2 v1.0.2 (i.e., anything compressed with pbzip2 can be decompressed with bzip2).
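
For example, an archive produced by pbzip2 can be decompressed with the standard bzip2 (the file name below is illustrative):

    pbzip2 myfile.tar
    bzip2 -d myfile.tar.bz2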

pbzip2 should work on any system that has a pthreads-compatible C++ compiler (such as gcc). It has been tested on: Linux, Windows (cygwin), Solaris, Tru64/OSF1, HP-UX, and Irix.

The default settings for pbzip2 will work well in most cases. The only switches you will likely need are -d to decompress files and -p to set the # of processors for pbzip2 to use, either because autodetect is not supported on your system or because you want to use a specific # of CPUs.
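
For instance, to decompress a file using exactly 4 CPUs (the file name is illustrative):

    pbzip2 -d -p4 myfile.tar.bz2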

OPTIONS

-b#
Where # is the file block size in 100k steps (default 9 = 900k)
-c
Output to standard out (stdout)
-d
Decompress file
-f
Force, overwrite existing output file
-k
Keep input file, do not delete
-p#
Where # is the number of processors (default: autodetect)
-r
Read entire input file into RAM and split between processors
-t
Test compressed file integrity
-v
Verbose mode
-V
Display version info for pbzip2 then exit
-1..9
Set BWT block size to 100k .. 900k (default 900k)
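
As a sketch of how the switches above combine (the file name is illustrative), the following compresses a file with verbose output, keeps the original, and uses 4 processors:

    pbzip2 -k -v -p4 myfile.tar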

FILE SIZES

You should be able to compress files larger than 4GB with pbzip2.

Files that are compressed with pbzip2 are broken up into pieces, and each individual piece is compressed. This is how pbzip2 runs faster on multiple CPUs, since the pieces can be compressed simultaneously. The final .bz2 file may be slightly larger than if it were compressed with the regular bzip2 program due to this file splitting (usually less than 0.2% larger). Files that are compressed with pbzip2 will also gain considerable speedup when decompressed using pbzip2.
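
A quick way to verify the resulting archive while keeping the original input (file names illustrative):

    pbzip2 -k myfile.tar
    pbzip2 -t myfile.tar.bz2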

Files that were compressed using bzip2 will not see speedup, since bzip2 packages the data into a single chunk that cannot be split between processors. If you have a large file that was created with bzip2 (say 1.5GB, for example), you will likely not be able to decompress it with pbzip2, since pbzip2 will try to allocate 1.5GB of memory to decompress it, and that allocation might fail depending on your system resources. If the same 1.5GB file had been compressed with pbzip2, it would decompress fine with pbzip2. If you are unable to decompress a file with pbzip2 due to its size, use the regular bzip2 instead.
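
In that situation, fall back to the standard tool (the file name is illustrative):

    bzip2 -d bigfile.bz2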

EXAMPLES

Example 1: pbzip2 myfile.tar

This example will compress the file "myfile.tar" into the compressed file "myfile.tar.bz2". It will use the autodetected # of processors (or 2 processors if autodetect is not supported) with the default file block size of 900k and the default BWT block size of 900k.

Example 2: pbzip2 -b15k myfile.tar

This example will compress the file "myfile.tar" into the compressed file "myfile.tar.bz2". It will use the autodetected # of processors (or 2 processors if autodetect is not supported) with a file block size of 1500k and a BWT block size of 900k. Because the trailing k switch is also given, the file "myfile.tar" will not be deleted after compression is finished.

Example 3: pbzip2 -p4 -r -5 myfile.tar second*.txt

This example will compress the file "myfile.tar" into the compressed file "myfile.tar.bz2". It will use 4 processors with a BWT block size of 500k. The file block size will be the size of "myfile.tar" divided by 4 (the # of processors) so that the data is split evenly among the processors. This requires that you have enough RAM for pbzip2 to read the entire file into memory for compression. pbzip2 will then use the same options to compress all other files that match the wildcard "second*.txt" in that directory.

Example 4: pbzip2 -d myfile.tar.bz2

This example will decompress the file "myfile.tar.bz2" into the decompressed file "myfile.tar". It will use the autodetected # of processors (or 2 processors if autodetect is not supported). The switches -b, -r, and -1..-9 are not valid for decompression.

SEE ALSO

bzip2(1)

AUTHOR

Jeff Gilchrist

http://compression.ca