
I'm finding myself in a terminal more and more often these days as I learn to do certain types of things quicker or more conveniently.

However, when it comes to copying a large amount of data (i.e. hundreds of gigabytes) from one HDD to another, I always revert to the GUI (Nautilus or Finder in my case; the file systems are ext4 or HFS+).

What I have in mind is the initial copying of data to a new larger HDD that's replacing an older one, or to an external back-up HDD.

Are there any tangible benefits to be had using terminal commands in this setting? If so, what are they?

EDIT

Sometimes with these large GUI copies it'll get tripped up somewhere along the way due to a corrupt file or for some other reason. I guess I was wondering if terminal commands, rather than the GUI method, can avoid this problem. It's often quite difficult to determine where the GUI copy has got to, where to resume, and which files are causing the issues.

To my eyes at least, these copies seem a little bit random as to where they start and end.

boehj
  • You're __absolutely__ right. Those GUI tools often *do* have a __tendency to choke__ on large numbers of small files, or on a few extremely large ones. See also my question, where I documented a variant of the latter behaviour in full detail: http://unix.stackexchange.com/questions/92472/out-of-memory-error-while-copying-large-files-with-pcmanfm . I also filed a bug report against `pcmanfm`, which suggested that the `g_file_copy()` method is actually what's at fault: https://sourceforge.net/p/pcmanfm/bugs/917/ Needless to say, that method is also used by `nautilus` and `thunar`. – syntaxerror Nov 21 '14 at 07:41

2 Answers


I don't really see a difference between copying many files and other tasks. Usually what makes the command line more attractive is:

  • simple tasks which are trivial enough for you to do on the command line, so that using the GUI would be a waste of time (faster to type a few characters than click in menus, if you know what characters to type);
  • very complex tasks which the GUI just isn't capable of doing.

There's another benefit I see to the command line in one very specific circumstance: if you're performing a very long operation, like copying many files, and you may want to check its progress while logged into your machine remotely, it's convenient to run the task in a terminal multiplexer like Screen or tmux. Start Screen, start the task inside Screen, then later connect to your machine with SSH and attach to that Screen session.

Gilles 'SO- stop being evil'
  • Thanks Gilles. Screen looks quite interesting. I'll put a bit of supplementary info in the question. – boehj May 04 '11 at 01:29

You might find it awkward to use the command line for moving large numbers of files from one directory or drive to another because you're not using the right tools. Something like rsync or rsnapshot is generally the preferred way to do this. I have a little homebrew bash script that uses rsync to do exactly this, and it works quite well, much better than the available point-click-and-drag GUI options, in my opinion.
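A minimal, self-contained sketch of the idea (the throwaway directories here are made up for the demonstration; on a real copy you'd point `rsync` at the old and new mount points, and the trailing slash on the source means "copy the contents of this directory"):

```shell
# Create a throwaway source and destination to demonstrate.
src=$(mktemp -d)
dst=$(mktemp -d)
mkdir -p "$src/photos"
echo "hello" > "$src/photos/a.txt"

# -a         archive mode: recurse and preserve permissions, times, symlinks
# --partial  keep partially transferred files, so an interrupted run
#            can be resumed by simply running the same command again
rsync -a --partial "$src/" "$dst/"

# A second run transfers nothing new: -i itemizes changes, and an
# unchanged tree typically produces no output at all.
rsync -ai --partial "$src/" "$dst/"

cat "$dst/photos/a.txt"   # prints "hello"
```

That rerun behaviour is exactly what makes rsync safer than a GUI drag for huge copies: a crash halfway through costs you a rerun, not a restart from scratch.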

  • I gave `rsync` a thought but wondered if it was of any benefit when not dealing with 'partial copies'. I'll take another look. Cheers. – boehj May 04 '11 at 05:39
  • @boehj: rsync has the advantage that if it's interrupted for any reason (network problem for a network copy, power failure, …), you can launch it again and it'll pretty much start again where it left off. – Gilles 'SO- stop being evil' May 04 '11 at 07:19
  • @Gilles - That's just what I want to hear. That makes complete sense now that I've given it a little extra moment's thought. Nothing to see here. :) Cheers. – boehj May 04 '11 at 07:38