I'm currently using the following approach, but restoring the archive requires double the disk space, because all parts have to be piped to tar before any of them can be deleted.
$ COPYFILE_DISABLE=true tar \
--create \
--directory ~/data/dataset \
--file - \
--use-compress-program lz4 \
--verbose \
. | \
split \
--bytes 10G \
--numeric-suffixes \
- \
dataset.tar.lz4.part
$ cat dataset.tar.lz4.part* | \
tar \
--extract \
--directory ~/data/dataset \
--file - \
--use-compress-program lz4 \
--verbose
Is there a more efficient approach where parts can be deleted FIFO (first in, first out) as they are decompressed?
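For reference, a minimal sketch of such a model, assuming GNU tar is available and that the part names sort in creation order (split's default two-digit numeric suffixes do). The helper name `extract_fifo` and its parameters are my own, not part of the commands above:

```shell
# Hypothetical helper: stream each part into tar over a pipe and delete it
# as soon as it has been fully read, so peak extra disk usage stays at
# roughly one part's size instead of the whole archive.
extract_fifo() {
  prefix=$1            # part-name prefix, e.g. dataset.tar.lz4.part
  dest=$2              # extraction directory, e.g. ~/data/dataset
  compress=${3:-lz4}   # decompression filter; lz4 assumed by default
  for part in "$prefix"*; do
    # cat finishes reading the part before rm runs (&& sequences them),
    # and the glob visits parts in suffix order, i.e. creation order.
    cat -- "$part" && rm -- "$part"
  done | tar \
    --extract \
    --directory "$dest" \
    --file - \
    --use-compress-program "$compress" \
    --verbose
}
```

The trade-off is durability: if extraction fails partway through, the already-deleted parts are gone, so this is only safe when the parts can be re-fetched or the archive is disposable.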