How to Download a File or Directory Using wget?

If you spend a lot of time in a terminal, chances are you occasionally want to download a file or a directory from the web without opening a browser. The command line utility "wget" lets you do exactly that, right from your terminal.

The beauty of wget is that it is non-interactive, meaning that wget can quietly work in the background. The wget manual explains that this non-interactive behavior "allows you to start a retrieval and disconnect from the system," letting wget get the job done on its own.

How to Use wget to Download a File from a Website?

To download a file using wget, simply run wget followed by the URL of the file. For example,

wget http://www.google.com/doodle4google/history.html

will fetch the history.html file from Google's doodle4google directory. While fetching the file, you will see wget print output similar to the following on your screen:

--2011-08-17 15:52:28--  http://www.google.com/doodle4google/history.html
Resolving www.google.com... 74.125.113.103, 74.125.113.104, 74.125.113.105, ...
Connecting to www.google.com|74.125.113.103|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified
Saving to: `history.html.1'

    [ <=> ] 9,889       --.-K/s   in 0.02s

2011-08-17 15:52:28 (480 KB/s) - `history.html.1' saved [9889]
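
Notice the .1 suffix in the saved filename: by default, wget does not overwrite an existing file of the same name, it appends a numeric suffix instead. If you would rather pick the output filename yourself, use the -O option (the filename below is just an example):

wget -O doodle-history.html http://www.google.com/doodle4google/history.html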

How to Use wget to Download a File in the Background?

Making wget run in the background is very simple: add "&" at the end of the command, which tells your shell to run it as a background job.

wget http://www.google.com/doodle4google/history.html &
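
Alternatively, wget has a built-in -b (--background) option that detaches on its own. Unless you redirect its messages elsewhere with -o, wget then writes its progress to a file named wget-log in the current directory, which you can follow with tail:

wget -b http://www.google.com/doodle4google/history.html
tail -f wget-log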

How to Use wget to Try Multiple Times to Download a File?

Want to download a big file over a slow or unreliable internet connection? Don't worry, wget can easily handle that: you can specify the number of times wget should retry if a download fails.

wget --tries=10 http://www.google.com/doodle4google/history.html &

The --tries (or -t) option lets you specify how many times wget should attempt the download before giving up.
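
For large downloads over flaky connections, it is often worth combining retries with -c (--continue), which resumes a partially downloaded file instead of starting over, and a network timeout. A sketch:

wget --tries=10 --continue --timeout=30 http://www.google.com/doodle4google/history.html

You can also pass --tries=inf if you want wget to keep retrying indefinitely.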

How to Use wget to Download an Entire Directory from a Website?

Perhaps wget is most useful for downloading an entire directory from the web. The -r (--recursive) option makes wget follow links and download everything under the given path:

wget -r http://path/to/the/directory
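
In practice you will usually want a couple of companion options: -np (--no-parent) keeps wget from ascending above the directory you asked for, and -l limits the recursion depth (the default is 5). For example:

wget -r -np -l 2 http://path/to/the/directory

Note that -r works by following links in the pages wget downloads, so this only retrieves the whole directory if the server exposes an index page for it.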