On an HPC cluster I am trying to launch many instances of one bash script (permute2.sh) from a driver script using GNU parallel, but it doesn't complete every job: some jobs finish, while others appear stuck.
permute1.sh:
PROCS=144
permutations=1000
seq 1 "$permutations" | parallel -j "$PROCS" sh permute2.sh {}
permute2.sh (takes 100 random lines from a file and performs some actions on them for each permutation):
id=$1
randomlines=100
# NOTE: srand() with no argument seeds from the clock in whole seconds,
# so jobs launched in the same second draw identical "random" lines.
# The +1 is needed because int(rand()*NR) ranges 0..NR-1 and a[0] is unset.
awk -v n="$randomlines" 'BEGIN{srand()} {a[NR]=$0}
END{for(i=1;i<=n;i++){x=int(rand()*NR)+1;print a[x]}}' \
    FILE.txt > results/randomlines.$id.txt
# do stuff with randomlines.$id.txt..
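For comparison, the sampling step can also be sketched with GNU coreutils `shuf`, which avoids holding the whole file in an awk array in every concurrent job and gets fresh randomness per process. This is a hedged sketch, not the original method: the demo input file created below stands in for the real FILE.txt, and `shuf -n` samples without replacement (add `-r` to allow repeats, like the awk loop):

```shell
set -eu
id=${1:-1}
mkdir -p results
# Demo stand-in for the real FILE.txt (500 numbered lines).
seq 1 500 > FILE.txt
# Draw 100 distinct random lines; each process is seeded independently.
shuf -n 100 FILE.txt > "results/randomlines.$id.txt"
```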
When I run permute1.sh I can see it creates 144 files, one per job slot (randomlines.1.txt through randomlines.144.txt), but most of them are empty and their jobs appear stuck; only some complete. What am I doing wrong?