I'm trying to copy a 100GB zip file via scp, but it keeps disconnecting due to the large file size.
What is the best solution to overcome this?
Here are some options.
1. Use the -C flag:
The -C flag enables compression during the file transfer.
scp -C /source/file.zip user@host:/destination/file.zip
2. Increase the SSH timeout on the server:
You can modify the ClientAliveInterval and ClientAliveCountMax options in the SSH server configuration file /etc/ssh/sshd_config on most systems.
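For example, the following server-side settings keep an otherwise idle connection alive (the values are illustrative, and you need to reload sshd after editing). If you cannot change the server, the client can send keepalives itself with -o ServerAliveInterval:

```shell
# In /etc/ssh/sshd_config on the server:
#   ClientAliveInterval 60     # probe the client every 60 seconds
#   ClientAliveCountMax 3      # disconnect only after 3 unanswered probes
# Then reload the SSH daemon, e.g.:
#   sudo systemctl restart sshd

# Client-side alternative (no server change needed; user@host and the
# paths are placeholders):
scp -o ServerAliveInterval=60 /source/file.zip user@host:/destination/file.zip
```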
3. Split the file into smaller parts:
Another approach is to split the large file into smaller parts using a tool like split or 7z, transfer the parts individually (each one can be retried on its own), and reassemble them on the destination.
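A sketch using split (file names and the scp target are placeholders); each part is small enough to retry cheaply, and the reassembled file can be verified with a checksum:

```shell
# Split the archive into 1 GB pieces: file.zip.part-aa, -ab, ...
split -b 1G file.zip file.zip.part-
# Copy the pieces (retry any that fail), e.g.:
#   scp file.zip.part-* user@host:/destination/
# On the destination, reassemble and verify against the source checksum:
cat file.zip.part-* > file.zip
sha256sum file.zip
```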
4. Use rsync:
Rsync is a powerful tool for file synchronization and transfers. It has built-in resumable transfers and can efficiently transfer only the parts of the file that have changed. You can use rsync instead of SCP to copy the file and resume the transfer in case of interruptions.
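A minimal sketch (user@host and the paths are placeholders): --partial keeps a half-transferred file on the destination so a re-run can pick it up instead of starting over, and -P is shorthand for --partial --progress.

```shell
# Resumable transfer over SSH; re-run the same command after a drop.
rsync -avP /source/file.zip user@host:/destination/
```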
5. Use SFTP:
SFTP is a file transfer protocol that runs over SSH. It can be more reliable than scp for large transfers because it can resume an interrupted transfer rather than restarting it.
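With a reasonably recent OpenSSH, sftp can resume a partially transferred download with reget (and an upload with reput). A sketch, with user@host and the paths as placeholders:

```shell
# "reget" continues from where the partial local copy left off.
sftp user@host <<'EOF'
reget /remote/path/file.zip /local/path/file.zip
EOF
```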
6. SSHFS/FUSE (my recommendation):
I would use SSHFS/FUSE. With this, you can mount a directory from the target server directly into a directory on your own system, letting you work as if the remote folder were local to the client.
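A sketch of that workflow (sshfs and FUSE must be installed; user@host, the mount point, and the paths are placeholders). Once mounted, you can copy with any local tool, including a resumable one like rsync:

```shell
# Mount the remote destination directory locally; "reconnect" and a
# keepalive interval help the mount survive brief network drops.
mkdir -p ~/remote-dest
sshfs user@host:/destination ~/remote-dest -o reconnect,ServerAliveInterval=15
# Copy with a resumable local tool:
rsync -avP /source/file.zip ~/remote-dest/
# Unmount when done:
fusermount -u ~/remote-dest    # on macOS: umount ~/remote-dest
```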