-1

Sorry for my English…

I am trying to download this ebook. I tried different methods with wget, such as

wget --domains --no-parent https://www.oreilly.de/german/freebooks/linag3ger

and

wget --domains --no-parent https://www.oreilly.de/german/freebooks/linag3ger/418_LinuxIVZ.html

These only download the links on the page.

With

wget -r https://www.oreilly.de/german/freebooks/linag3ger/418_LinuxIVZ.html

that is, recursively, wget downloads the whole website: more than 1 GB…

I want to download it and later convert it to PDF, as described here.

Can someone please help me? Thanks!

dsaj34v
  • I misread the title as "How to download wget with an ebook" – annahri Feb 25 '20 at 00:32
  • I have now changed the title to **How to download with Wget an eBook**; even with the preposition **with** before the name **wget**, I think it is clear what is meant… – dsaj34v Feb 25 '20 at 10:27
  • Because some people did not read the title properly… I am now at **-2**… – dsaj34v Feb 25 '20 at 10:32
  • @annahri, as **Confucius** said: **To make a mistake and not to correct it - that is really to make a mistake** – dsaj34v Feb 25 '20 at 10:38

2 Answers

3

Looks like all you have to do is this:

wget -np -r -l 2 'https://www.oreilly.de/german/freebooks/linag3ger/'

This downloads 49 files, 3.7MB in total.

Basically you need `-r` to tell wget to download recursively, and `-l 2` to tell wget not to go deeper than 2 levels.
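If you also want the local copy to be browsable offline, here is a sketch of a variant of the same command; the `-k` (`--convert-links`) and `-p` (`--page-requisites`) flags are my additions beyond the command above, taken from the GNU wget manual:

```shell
# Same no-parent, recursive, depth-2 download as above, plus:
#   -k  rewrite links in the saved HTML so they work locally
#   -p  also fetch each page's requisites (images, CSS)
wget -np -r -l 2 -k -p 'https://www.oreilly.de/german/freebooks/linag3ger/'
```

This needs network access and downloads the same pages as the accepted command, just in a form you can open directly in a browser afterwards.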

SparedWhisle
  • Thanks! I had tried `-r` before, but without telling wget the depth… I wish you a nice day! – dsaj34v Feb 23 '20 at 11:55
  • Info for other people, what **man wget** says about `-np`: **Do not ever ascend to the parent directory when retrieving recursively. This is a useful option, since it guarantees that only the files below a certain hierarchy will be downloaded.** – dsaj34v Feb 23 '20 at 12:19
  • Can someone please tell me why I got a **-1**? I think people who give a **-1** MUST say why they gave it! – dsaj34v Feb 23 '20 at 15:21
  • One more **-1**… people like you frighten me; just the thought that one day you might have more power… – dsaj34v Feb 23 '20 at 17:47
  • Some people might think it’s not ok to download the book when the website doesn’t provide a download option. That’s my guess. But I think it’s totally ok to download. – SparedWhisle Feb 23 '20 at 20:42
  • I mention it because every time I ask a question, a warning appears saying that I have many **-1**s… I also think some people give a **-1** without a sound reason… – dsaj34v Feb 24 '20 at 10:02
  • …Also, this book is available for download… https://en.wikipedia.org/wiki/Linux_Network_Administrator%27s_Guide – dsaj34v Feb 24 '20 at 10:15
0

You will have to give wget a criterion for what to download (everything below that folder?) and what not (the rest of the O'Reilly website).

If you have a GUI available, a tool like "HTTrack Website Copier" may simplify that task.
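As an illustration of such a criterion, here is a sketch using wget's directory whitelist; `--no-parent` and `--include-directories` are the relevant options per the GNU wget manual:

```shell
# Restrict the crawl to the freebook's directory tree:
#   --no-parent            never ascend above the start directory
#   --include-directories  only follow URLs under the listed path
wget -r --no-parent --include-directories=/german/freebooks/linag3ger \
     'https://www.oreilly.de/german/freebooks/linag3ger/'
```

With this restriction, links pointing elsewhere on www.oreilly.de are simply not followed.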

BurninLeo
  • **anything below that folder?**: how should I understand that? The folder is **linag3ger**… with **https://www.oreilly.de/german/freebooks/** an empty page comes up… – dsaj34v Feb 23 '20 at 10:13
  • By and large, the crawled URLs must be limited to start with `https://www.oreilly.de/german/freebooks/linag3ger/`. – BurninLeo Feb 26 '20 at 18:06