
My first naive approach would be to write the output of the Python script into a file and send it via scp, but given that the output is generated over time (at irregular intervals, anywhere between 1 second and 1 day apart), I don't think that opening a file, writing it, and then sending it is the smartest way.

Is there a better way? It doesn't have to be ssh but something that does not require port-forwarding or UDP holepunching.

topkek
    Does `myprogram | ssh -o ServerAliveInterval=300 me@remotecomputer "cat > file"` do what you want? – Mark Plotnick Jan 05 '21 at 15:22
  • More details on @MarkPlotnick 's answer [here](https://unix.stackexchange.com/questions/3026/what-options-serveraliveinterval-and-clientaliveinterval-in-sshd-config-exac) – FelixJN Jan 05 '21 at 15:26
  • What I would do, given that it can take a long time, is write the output to a file and then use SSHFS to mount it on the local machine and monitor the content. – BANJOSA Jan 05 '21 at 16:27

1 Answer


Pipes

This is not an ssh problem. Read up on shell pipelines: how one process's standard output can be piped into a second process's standard input. See https://www.youtube.com/watch?v=bKzonnwoR2I for an introduction.
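A minimal sketch of the idea: one process's stdout feeds the next process's stdin, line by line, as the data is produced.

```shell
# printf emits three lines on stdout; the pipe delivers them to
# wc -l's stdin, which counts them as they arrive.
printf 'one\ntwo\nthree\n' | wc -l
# → 3
```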

If you already know this, then:

Consider `ssh remote-machine remote-command` to be equivalent to `local-command`. E.g. `ssh remote-machine ls` gets a directory listing on the remote. `ls | less` puts a directory listing through a pager. So `ssh remote-machine ls | less` will put a remote directory listing through a pager.
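Applied to the question, this means the pipe can point the other way: the local program's output can be streamed straight into a command running on the remote. A sketch, assuming a hypothetical program `myprogram`, a host `remotecomputer`, and key-based authentication already set up (as suggested in the comments, `ServerAliveInterval` keeps the connection open during long quiet periods):

```shell
# Stream myprogram's stdout into a file on the remote as it is produced.
# 'cat > output.log' on the remote simply writes its stdin to the file.
#
#   myprogram | ssh -o ServerAliveInterval=300 me@remotecomputer 'cat > output.log'
#
# The 'cat > file' half of the pattern works the same way locally:
printf 'line 1\nline 2\n' | cat > /tmp/stream-demo.txt
cat /tmp/stream-demo.txt
```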

ctrl-alt-delor