I'm writing a bash script that needs to fetch all `*_out.csv` files from a directory on a remote server. These files sit several directories deep inside a parent directory. For instance, say the parent directory is called ox_20190404/. I can find all my files by running:
find ox_20190404/assessment/LWR/validation -type f -name "*_out.csv"
This question answers part of my question, but since I don't want to copy the directory in its entirety, I need to figure out how to work the find command above into it. Suppose I start with this:
$ dir="/projects/ox/git"
$ server="myusername@server"
$ scp $server:$dir/$(ssh $server "ls -t $dir | head -1") .
How would I grab the files I need from there?
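One possible approach (a sketch, not tested against a real server): run `find` on the remote side over `ssh`, then fetch each matched path with `scp`. The demo below simulates that locally, with `cp` standing in for `scp` and a mock tree under `/tmp/grab/remote` standing in for `$dir` on the server; those paths and the `caseA` file names are made up for illustration. Over the network, the `find` would run inside `ssh "$server" "…"` and the copy would be `scp "$server:$f" .`.

```shell
#!/bin/sh
# Build a mock "remote" tree (stands in for $dir on the server).
mkdir -p /tmp/grab/remote/ox_20190404/assessment/LWR/validation/caseA
echo "keep" > /tmp/grab/remote/ox_20190404/assessment/LWR/validation/caseA/caseA_out.csv
echo "skip" > /tmp/grab/remote/ox_20190404/assessment/LWR/validation/caseA/caseA_in.csv

mkdir -p /tmp/grab/local
cd /tmp/grab/local

# Remote version: ssh "$server" "find $dir/ox_20190404/assessment/LWR/validation -type f -name '*_out.csv'"
find /tmp/grab/remote/ox_20190404/assessment/LWR/validation \
     -type f -name "*_out.csv" |
while IFS= read -r f; do
    cp "$f" .           # network version: scp "$server:$f" .
done

ls    # the matched *_out.csv files, flattened into the current directory
```

Note this flattens everything into one directory, so files with identical basenames would clobber each other; preserving the remote layout (the second part of the question) avoids that.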
The last part of my question: is there a way to place all the copied files in the same relative paths and directory structure they had on the remote server?
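One common technique for this (again a sketch, with hypothetical local paths under `/tmp/demo` standing in for the remote machine): pipe the `find` output into `tar` with `-T -` so the archive keeps the relative paths, then unpack it locally. Over the network, the first half of the pipeline would be wrapped as `ssh "$server" "cd $dir && find … | tar -cf - -T -"`.

```shell
#!/bin/sh
# Simulate the remote tree locally (mirrors the layout in the question).
mkdir -p /tmp/demo/src/ox_20190404/assessment/LWR/validation/case1
echo "data" > /tmp/demo/src/ox_20190404/assessment/LWR/validation/case1/run_out.csv
echo "skip" > /tmp/demo/src/ox_20190404/assessment/LWR/validation/case1/run_in.csv
mkdir -p /tmp/demo/dst

# cd first so find emits relative paths; tar then stores and recreates them.
cd /tmp/demo/src
find ox_20190404/assessment/LWR/validation -type f -name "*_out.csv" \
  | tar -cf - -T - \
  | tar -xf - -C /tmp/demo/dst

ls /tmp/demo/dst/ox_20190404/assessment/LWR/validation/case1
```

Because the archive is built from relative paths, the extraction under `/tmp/demo/dst` reproduces the full `ox_20190404/...` hierarchy, containing only the `*_out.csv` files. `rsync` with `--include='*_out.csv' --include='*/' --exclude='*'` filters is another option if it is available on both ends.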