
I'd like to compress one huge file (~200 GB) from the command line. There's no need to archive its containing directory along with it; a fast operation and a high compression ratio are preferred. What are some good ways to do this?

Matan
  • Depends on what the contents of the file are, whether speed or size is more important, and other things. Beyond that, you can use any command-line utility such as `gzip`, `bzip2`, `xz`, `lzma`, `zip`, `7z`, `rar`, ... there's plenty to choose from. – peterph Jan 15 '15 at 12:31
  • -1 for “best”. For the most part, there isn't a single best way to do things under *nix. As @peterph mentioned, what works well varies greatly depending on what you actually want to happen. Without a lot more detail about what you want, there is no “best” option. – HalosGhost Jan 15 '15 at 12:47
  • 1
    About `mksquashfs`: [link](http://unix.stackexchange.com/a/123257/52934) and [link](http://unix.stackexchange.com/a/151057/52934). – mikeserv Jan 15 '15 at 12:48
  • 200 GB is a big file; I think a faster compression algorithm is better. You can compare some compression algorithms and choose a suitable tool for the job. – zhangjie Jan 15 '15 at 12:44
  • 7z on an i7 took: real 147m3.783s, user 243m1.971s, sys 5m29.123s – Matan Jan 15 '15 at 15:03
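The comparison zhangjie suggests can be sketched on a small compressible sample before committing to a 200 GB run. This is a sketch, not a rigorous benchmark: `sample.bin` is a hypothetical stand-in file, and `bzip2`/`xz` may need to be installed separately.

```shell
# Build a small compressible sample file (stand-in for a slice of the real file).
seq 1 1000000 > sample.bin

# Time each compressor that is installed and record the output size.
for cmd in gzip bzip2 xz; do
  command -v "$cmd" >/dev/null || continue   # skip tools that aren't installed
  echo "== $cmd =="
  time "$cmd" -c sample.bin > "sample.bin.$cmd"
  ls -l "sample.bin.$cmd"
done
```

For a realistic estimate, run the same loop on a representative slice of the actual file (e.g. `head -c 1G hugefile > sample.bin`), since compression ratio depends heavily on the data.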

1 Answer


I would recommend using 7z.

7z is 7-Zip's archive format, providing a high compression ratio through powerful compression algorithms that can take advantage of parallel computing on modern multicore CPUs.

To create a 7z archive you need the p7zip-full package. To create the archive, run `7z a <your_archive_name>.7z <filename>`.
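For a large file where both ratio and speed matter, it may be worth setting the compression level and multithreading switches explicitly. A minimal sketch, assuming `hugefile` is a hypothetical name for your file and that p7zip-full is installed:

```shell
# Small compressible stand-in for the real ~200 GB file (hypothetical data).
seq 1 100000 > hugefile

command -v 7z >/dev/null || { echo "install p7zip-full first"; exit 0; }

# a       : add files to an archive
# -mx=9   : strongest compression preset (slowest)
# -mmt=on : multithread across all available cores
7z a -mx=9 -mmt=on hugefile.7z hugefile

# Verify the archive's integrity.
7z t hugefile.7z
```

Lower `-mx` values (e.g. `-mx=5`, the default) trade some ratio for considerably less CPU time, which may matter at the 200 GB scale, as the timing in the question's comments suggests.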

kirill-a