Questions tagged [download]

Copying data from one device to another over the internet.

This tag should be used in questions that concern downloading data.

308 questions
461 votes, 5 answers

What is the difference between curl and wget?

I am keen to know the difference between curl and wget. Both are used to get files and documents, but what is the key difference between them? Why are there two different programs?
lakshmen
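A quick sketch of the practical difference (the example URL is a placeholder): wget is download-oriented and writes to disk by default, while curl is a general-purpose transfer tool that writes to stdout and underpins the libcurl library.

```shell
# wget saves to a file named after the URL by default,
# and can mirror sites recursively (-r)
wget https://example.com/file.tar.gz

# curl writes to stdout by default; -O saves under the remote name
curl -O https://example.com/file.tar.gz

# curl covers many more protocols and request types,
# e.g. an HTTP POST:
curl -d 'key=value' https://example.com/api
```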
136 votes, 4 answers

Throttle the download speed of wget or curl while downloading

Is it possible to throttle (limit) the download speed of wget or curl? Is it possible to change the throttle value while it is downloading?
Gautam
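Both tools can cap their bandwidth at startup; a minimal sketch (placeholder URL):

```shell
# Cap wget at roughly 100 KB/s
wget --limit-rate=100k https://example.com/big.iso

# curl's equivalent (the option takes a separate argument, not '=')
curl --limit-rate 100k -O https://example.com/big.iso
```

Both tools read the limit once at startup, so changing it mid-transfer generally requires external traffic shaping (e.g. `tc`) rather than a tool option.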
95 votes, 8 answers

How to download a folder from Google Drive using the terminal?

I want to download a folder from my Google Drive using the terminal. Is there any way to do that? I tried this: $ wget "https://drive.google.com/folderview?id=0B-Zc9K0k9q-WWUlqMXAyTG40MjA&usp=sharing" But it is downloading this text file:…
user22180
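wget alone cannot traverse a Drive folder listing; one common approach is the third-party gdown tool (the folder id below is a placeholder):

```shell
# gdown is a third-party Python tool, not part of wget
pip install gdown

# --folder fetches every file in a shared folder
gdown --folder 'https://drive.google.com/drive/folders/FOLDER_ID'
```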
88 votes, 9 answers

How to download a package, but not install it, with the apt-get command?

sudo apt-get install pppoe will download the pppoe package and install it. Is it possible to just download the pppoe package and not install it with the apt-get command? wget…
showkey
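Assuming a Debian/Ubuntu system, apt-get can fetch a package without installing it; a minimal sketch:

```shell
# Fetch just the pppoe .deb into the current directory
apt-get download pppoe

# Or download it (plus any missing dependencies) into apt's cache
# at /var/cache/apt/archives without installing
sudo apt-get install --download-only pppoe
```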
54 votes, 4 answers

Resume failed download using Linux command line tool

How do I resume a partially downloaded file using a Linux command-line tool? I downloaded a large file partially, i.e. 400 MB out of 900 MB, due to a power interruption, but when I start downloading again it resumes from scratch. How do I start from 400…
amolveer
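Both wget and curl can pick up where a partial file left off, provided the server supports HTTP range requests; a sketch with a placeholder URL:

```shell
# wget: -c/--continue appends to the existing partial file
wget -c https://example.com/large.iso

# curl: '-C -' lets curl work out the resume offset from the file size
curl -C - -O https://example.com/large.iso
```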
43 votes, 9 answers

How to download files and folders from OneDrive using wget?

How can I use wget to download files from OneDrive? (And batch files and entire folders, if possible?)
charles
42 votes, 3 answers

How do I use wget with a list of URLs and their corresponding output files?

Suppose list_of_urls looks like this: http://www.url1.com/some.txt http://www.url2.com/video.mp4 I know how to use that with: wget -i list_of_urls But, what if my list_of_urls has this, and they all return proper files like PDF's or…
Kit
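One common approach, assuming the list is reshaped into two columns (URL, then desired output name — a format chosen here purely for illustration):

```shell
# list_of_urls, two whitespace-separated columns per line:
#   http://www.url1.com/some.txt  notes.txt
#   http://www.url2.com/video.mp4 lecture.mp4
while read -r url name; do
    wget -O "$name" "$url"
done < list_of_urls
```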
39 votes, 2 answers

How do I distribute a large download over multiple computers?

I need to download a large file (1GB). I also have access to multiple computers running Linux, but each is limited to a 50kB/s download speed by an admin policy. How do I distribute downloading this file on several computers and merge them after all…
B Faley
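A sketch of one way to split the work using curl's byte-range support (placeholder URL; the server must honor HTTP Range headers):

```shell
# Machine 1: fetch the first 500 MB
curl -r 0-524287999 -o part1 https://example.com/big.bin
# Machine 2: fetch the remainder ('-' means to end of file)
curl -r 524288000- -o part2 https://example.com/big.bin

# After collecting the parts on one machine, reassemble:
cat part1 part2 > big.bin
```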
37 votes, 2 answers

Download files and create same file structure as the source

I have a config file which consists of a list of URIs I want to download. For example, http://xyz.abc.com/Dir1/Dir3/sds.exe http://xyz.abc.com/Dir2/Dir4/jhjs.exe http://xyz.abc.com/Dir1/itr.exe I want to read the config file and copy each…
NGambit
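wget can mirror the remote directory layout by itself; a minimal sketch, assuming the config file holds one URI per line:

```shell
# -x (--force-directories) recreates host/Dir1/Dir3/... locally;
# -i reads the URL list from a file; add -nH to drop the hostname level
wget -x -i config_file
```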
36 votes, 7 answers

Download big file over bad connection

Is there an existing tool that can be used to download big files over a bad connection? I have to regularly download a relatively small file: 300 MB, but the slow (80-120 KBytes/sec) TCP connection randomly breaks after 10-120 seconds. (It's a big…
Crouching Kitten
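wget's retry options make it reasonably robust on flaky links; a sketch (placeholder URL):

```shell
# -c resumes after each break, -t 0 retries forever,
# --read-timeout abandons a stalled connection after 15 s
wget -c -t 0 --retry-connrefused --read-timeout=15 https://example.com/file.bin
```

aria2c is another option here: its segmented downloads (e.g. `aria2c -x4 -m0 URL`) keep several connections open and retry each one independently.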
27 votes, 4 answers

wget - How to download recursively and only specific mime-types/extensions (i.e. text only)

How can I download a full website but ignore all binary files? wget has this functionality via the -r flag, but it downloads everything, and some websites are just too much for a low-resource machine, and it's of no use for the specific reason…
Omar Al-Ithawi
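wget filters by filename extension rather than true MIME type; a sketch of both the allow-list and deny-list forms (placeholder URL):

```shell
# Keep only text-like suffixes while recursing
wget -r -A 'html,htm,txt,css' https://example.com/

# Or reject known binary suffixes instead
wget -r -R 'zip,iso,exe,jpg,png,gif,pdf' https://example.com/
```

Newer wget releases also offer `--accept-regex`/`--reject-regex` for filtering on the whole URL rather than the suffix.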
23 votes, 8 answers

command-line tool for a single download of a torrent (like wget or curl)

I'm interested in a single command that would download the contents of a torrent (and perhaps participate as a seed following the download, until I stop it). Usually, there is a torrent-client daemon which should be started separately beforehand,…
imz -- Ivan Zakharyaschev
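aria2c can act as a one-shot, foreground torrent client; a minimal sketch (the magnet hash below is a placeholder):

```shell
# Downloads the torrent contents, then exits;
# --seed-time=0 stops seeding immediately after completion
aria2c --seed-time=0 file.torrent

# Magnet links work the same way
aria2c 'magnet:?xt=urn:btih:PLACEHOLDER_HASH'
```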
19 votes, 3 answers

Resume an aria2-downloaded file by its *.aria2 file

I have a partially downloaded file with aria2. Next to it is a file with the same name ending in .aria2. I don't know the download link; I only have these two files. I want to know how I could resume the download in this situation. Note: *.aria2 is…
r004
16 votes, 13 answers

Is there a way to download pure Unix?

I'm just asking out of curiosity: is there a way to obtain a 'pure', so to speak, copy of Unix? So, not OS X or Linux with Unix in the background, but simply Unix..
James Litewski
16 votes, 2 answers

Wget: convert-links and avoid redownloading already fetched files?

I am downloading data spread among multiple files that don't change once published. Therefore --timestamping is not good enough, because it constantly checks whether each resource has changed, which is, in my case, completely pointless. --no-clobber…