
I am using xargs in conjunction with cut, but I am unsure how to get the output of cut into variables that I can use for further processing.

So, I have a text file like so:

test.txt:

/some/path/to/dir,filename.jpg
/some/path/to/dir2,filename2.jpg
...

I do this:

cat test.txt | xargs -L1 | cut -d, -f 1,2
/some/path/to/dir,filename.jpg

but what I'd like to do is:

cat test.txt | xargs -L1 | cut -d, -f 1,2 | echo $1 $2

where $1 and $2 are /some/path/to/dir and filename.jpg.

I am stumped that I cannot seem to be able to achieve this.

What I want to achieve is:

read_somehow_the text_fields_from text_file |  ./mypgm -i $1 -o $2
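One way to get the two fields into `$1` and `$2` with xargs itself is to hand each line to a small `sh -c` wrapper; xargs appends the fields as positional parameters. This is a sketch, and it assumes the paths and filenames contain no spaces or glob characters (here `echo` stands in for `./mypgm` as a dry run):

```shell
# Turn the comma into whitespace so xargs sees two arguments per line,
# then pass them to an inline shell as $1 and $2.
tr ',' ' ' < test.txt | xargs -L1 sh -c 'echo ./mypgm -i "$1" -o "$2"' sh
```

The trailing `sh` becomes `$0` inside the inline script, so the two fields from each line land in `$1` and `$2`.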
Kusalananda
JohnJ
  • `xargs -L1` is not really doing anything for you here since `cut` operates line-by-line anyway. How about something like `while IFS=, read x y; do echo "\$x is $x and \$y is $y"; done < test.txt` instead? – steeldriver Mar 15 '22 at 12:44
  • What are you trying to achieve? To split fields by spaces instead of commas can be done by `tr`, or just the simple `cut -f1,2 -d, --output-delimiter ' ' file` will be sufficient. – White Owl Mar 15 '22 at 12:47
  • Does this answer your question? [Passing multiple parameters via xargs](https://unix.stackexchange.com/questions/387076/passing-multiple-parameters-via-xargs) – G-Man Says 'Reinstate Monica' Mar 15 '22 at 21:32

1 Answer


GNU parallel has more options than xargs for reading columnar data; you could use

$ parallel --colsep , echo ./mypgm -i {1} -o {2} :::: test.txt
./mypgm -i /some/path/to/dir -o filename.jpg
./mypgm -i /some/path/to/dir2 -o filename2.jpg

Alternatively, using a shell loop:

$ while IFS=, read -r x y rest_if_any_ignored; do echo ./mypgm -i "$x" -o "$y"; done < test.txt
./mypgm -i /some/path/to/dir -o filename.jpg
./mypgm -i /some/path/to/dir2 -o filename2.jpg

Remove the echo once you are satisfied that it is doing the right thing.
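If you specifically want the fields available as `$1` and `$2`, as in the question, the shell's `set` builtin can split each line for you. A sketch, assuming the fields contain no whitespace or glob characters:

```shell
while IFS= read -r line; do
  IFS=,                          # split the unquoted expansion below on commas
  set -- $line                   # now $1 is the directory, $2 the filename
  unset IFS                      # restore default word splitting
  echo ./mypgm -i "$1" -o "$2"
done < test.txt
```

As above, drop the `echo` once the printed commands look right.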

Stéphane Chazelas
steeldriver
  • very interesting - because I was originally running parallel with two max jobs using the --jobs param and I assumed using one job with parallel is impossible. thank you for the solution! – JohnJ Mar 15 '22 at 18:24