
I can get all files on the bash patches site by downloading them in a sequence:

SEQ=$(seq -f "%03g" 1 30)
for i in $SEQ; do 
  wget http://ftp.gnu.org/gnu/bash/bash-4.3-patches/bash43-$i;
done

But then I would have to know the maximum number.

Is there a way to just fetch the directory listing and extract all patch files from it for downloading?

rubo77
  • Why not recursively download the directory? Also see: http://unix.stackexchange.com/q/118605/70524 – muru Oct 06 '14 at 15:30
  • because it is a website, How would i extract the wget links? – rubo77 Oct 06 '14 at 15:31
  • Something like: http://unix.stackexchange.com/q/53397/70524 combined with http://unix.stackexchange.com/a/84017/70524 – muru Oct 06 '14 at 15:32
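The listing-extraction idea in the question can be sketched as follows. This is a hypothetical sketch: the HTML snippet stands in for what `curl -s http://ftp.gnu.org/gnu/bash/bash-4.3-patches/` would return, and the final `wget` step is shown only as a comment, untested against the live server.

```shell
#!/bin/sh
# Stand-in for: listing=$(curl -s http://ftp.gnu.org/gnu/bash/bash-4.3-patches/)
listing='<a href="bash43-001">bash43-001</a>
<a href="bash43-002">bash43-002</a>
<a href="bash43-002.sig">bash43-002.sig</a>'

# Extract the bash43-NNN names (the .sig entries reduce to the same
# patch names) and de-duplicate.
printf '%s\n' "$listing" \
  | grep -oE 'bash43-[0-9]{3}' \
  | sort -u
# prints:
#   bash43-001
#   bash43-002
# Each extracted name could then be fetched with:
#   wget "http://ftp.gnu.org/gnu/bash/bash-4.3-patches/$name"
```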

2 Answers


You could use wget with recursive downloading:

wget -nc -nd -nH -np -r -R '*.*'  http://ftp.gnu.org/gnu/bash/bash-4.3-patches/

Explanation:

  • -nc: no-clobber (don't overwrite existing files), probably not necessary.
  • -nd: Don't create hierarchy of directories.
  • -nH: Don't create directory based on hostname. Or you'd find everything downloaded to a directory called ftp.gnu.org.
  • -np: Never ascend to the parent directory.
  • -r: Download recursively.
  • -R '*.*': Reject everything with a . in its filename (skips index.html and the like). Rejected HTML files are still downloaded so wget can follow their links for recursion, then deleted. An accept list (-A) may be used instead.
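As the last bullet mentions, an accept list can replace the reject pattern. A hedged variant of the same command, assuming every patch follows the bash43-NNN naming scheme:

```shell
# Accept-list variant: -A takes comma-separated shell-style globs;
# index pages are still fetched for recursion, then deleted.
wget -nc -nd -nH -np -r -A 'bash43-???' \
  http://ftp.gnu.org/gnu/bash/bash-4.3-patches/
```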
muru
for i in $(seq -f "%03g" 1 999); do
  wget "http://ftp.gnu.org/gnu/bash/bash-4.3-patches/bash43-$i"
  if [[ $? -ne 0 ]]; then
    # wget failed (404): the previous patch was the last one available.
    # 10# forces base-10 so zero-padded numbers aren't read as octal.
    MAX=$((10#$i - 1))
    break
  fi
done
echo "$MAX files downloaded"
rubo77