
Take the following Bash script, `3-output-writer.sh`:

echo A >&1
echo B >&2
echo C >&3

Of course, when run as `. 3-output-writer.sh`, it fails with the error `3: Bad file descriptor`, because Bash has nothing attached to file descriptor 3. One can easily run `. 3-output-writer.sh 3>file.txt`, though, and Bash is made happy.
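For a longer session, the descriptor can also be opened once with `exec` and left open across several invocations. A minimal sketch, assuming the script above is saved as `3-output-writer.sh` in the current directory:

```shell
exec 3>file.txt        # open fd 3 for writing in the current shell
. ./3-output-writer.sh # A -> stdout, B -> stderr, C -> file.txt
exec 3>&-              # close fd 3 again when done
```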

But here's the question: how do I pipe all three streams into another process, so that it has all three to work with? Is there any way other than creating three named pipes, as in,

mkfifo pipe1 pipe2 pipe3  # prepare pipes
. 3-output-writer.sh 1>pipe1 2>pipe2 3>pipe3 &  # background the writer, awaiting readers
3-input-reader pipe1 pipe2 pipe3  # some sort of reader, careful not to block
rm pipe1 pipe2 pipe3

?
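For what it's worth, Bash process substitution can set up the same plumbing without the explicit `mkfifo`/`rm` bookkeeping. A sketch, where `reader1`, `reader2`, and `reader3` are hypothetical stand-ins for whatever consumers should receive each stream:

```shell
# Each >(cmd) becomes an anonymous pipe; Bash wires the fd to it.
. 3-output-writer.sh \
    1> >(reader1) \
    2> >(reader2) \
    3> >(reader3)
```

These are still three separate pipes under the hood, so the caveat about careful, non-blocking reading applies just as with named FIFOs.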

  • Can you clarify the question? The title is asking about piping (`|`), but the prose is about redirecting (`>`). Are you sure you want to redirect Standard Output of one process, Standard Error of a second, and then (presumably) the Standard Input of a third? `>&1` may not do what you think it does based on what I am inferring from the question. You also ask about directing all three streams to **a** process, singular (emphasis mine), and not a file or set of files as demonstrated. – DopeGhoti Aug 06 '20 at 19:59
  • @DopeGhoti You seem to be reading the question in a way that assumes the OP is a noob, and twisting the meaning to be them asking how to do basic redirection. However, I read it as them being a power user, trying to do something more complex. – ctrl-alt-delor Aug 06 '20 at 20:04
  • I make no assumptions; I am unclear as to what the actual goal is because the title and question are so inconsistent in what they are discussing, and it is not clear what the desired end result actually is. – DopeGhoti Aug 06 '20 at 20:05
  • @DopeGhoti, seems to me they're using `>&1` right, it's in the script that produces output to all three file handles. And isn't there just one process they're redirecting to, the `3-input-reader`? – ilkkachu Aug 06 '20 at 20:18
  • Does "write a Perl script to do it" count as any other way? Because I don't think the shell will have a way to do that, but Perl would let you call the `pipe()` syscall and shuffle the resulting filehandles to child processes. Not that I still get what the point of this exercise is :) – ilkkachu Aug 06 '20 at 20:19
  • it's all fine you just need to explicitly read the fifos like you would do it with regular files – alecxs Aug 06 '20 at 20:24
  • I was thinking there could be some way to "multi-pipe" one process to another, that is, connect several outputs of one into several inputs of another. But it seems clearer to me now that while a process can have several outputs, there is no mechanism for operating on several inputs other than some sort of external plumbing (like named FIFOs, temp files, or buffers of some sort) and careful reading from them (so as not to block). While the former is easy in a shell, the latter might require a more flexible language. – Sinus the Tentacular Aug 07 '20 at 15:19
  • Bash has a reliable non-blocking read: `IFS='' read -u "${fd}" -t 0 -r txt` polls an fd (which may be stdin or a named pipe) and returns a status code. Changing to `-t 0.05` gets data if available, or times out. – Paul_Pedant Aug 07 '20 at 17:36
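The polling idea from the comment above can be sketched as follows (the fifo name and the fd number 4 are arbitrary choices for the illustration, and the background `echo` stands in for a real writer):

```shell
mkfifo pipe1                 # one of the writer's output pipes
echo hello > pipe1 &         # hypothetical writer in the background
exec 4< pipe1                # attach the fifo to fd 4 for reading
if IFS='' read -u 4 -t 0.5 -r txt; then
    echo "got: $txt"         # prints "got: hello"
fi
exec 4<&-                    # close fd 4
rm pipe1
```

With `-t 0` instead, `read` only reports (via its exit status) whether data is already available, without consuming it.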
  • I understood your question to mean you want this: `3-output-writer.sh 2>&1 3>&1 | other-program` –  Aug 10 '20 at 08:24
  • @alecxs -- oops; looks like I did not read question carefully. Thanks for catching that –  Aug 14 '20 at 07:59

0 Answers