Download Multiple Files with Wget for Mac
GNU Wget is a free utility for non-interactive download of files from the Web. It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies. Recently, I was downloading an Ubuntu Linux ISO (618 MB) file for testing purposes on my home PC. My uninterruptible power supply (UPS) unit was not working. I started the download with the following wget command:
$ wget http://ftp.ussg.iu.edu/linux/ubuntu-releases/5.10/ubuntu-5.10-install-i386.iso
However, due to the power supply problem, my computer rebooted at 98% of the download. After the reboot, I typed the same wget command at a shell prompt:
$ wget http://ftp.ussg.iu.edu/linux/ubuntu-releases/5.10/ubuntu-5.10-install-i386.iso
However, wget started downloading the ISO image from scratch again. I had expected wget to resume the partially downloaded ISO file.
wget resume download
The general method for using wget to grab files from archive.org is: generate a list of archive.org item identifiers (the tail end of the URL for an archive.org item page) from which you wish to grab files; create a folder (a directory) to hold the downloaded files; and construct your wget command to retrieve the desired files.
Download multiple files / URLs using wget -i
First, store all the download URLs in a text file, one per line:
$ cat download-file-list.txt
URL1
URL2
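To fetch everything in that list, point wget at the file with -i. A minimal sketch of the workflow (the downloads directory name is illustrative; download-file-list.txt is the file from above):
$ mkdir downloads && cd downloads
$ wget -i ../download-file-list.txt
The -i (--input-file) option makes wget read URLs from the given file, one per line, and download each in turn.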
After reading wget(1), I found the -c or --continue option to continue getting a partially downloaded file. This is useful when you want to finish a download started by a previous instance of wget, or by another program. The syntax is:
$ wget -c [URL]
$ wget --continue [URL]
So I decided to continue getting the partially downloaded ubuntu-5.10-install-i386.iso file using the following command:
$ wget -c http://ftp.ussg.iu.edu/linux/ubuntu-releases/5.10/ubuntu-5.10-install-i386.iso
OR
$ wget --continue http://ftp.ussg.iu.edu/linux/ubuntu-releases/5.10/ubuntu-5.10-install-i386.iso
Make sure you run the wget command in the same directory where the first download started. If there is a file named ubuntu-5.10-install-i386.iso in the current directory, wget will assume that it is the first portion of the remote file and will ask the server to continue the retrieval from an offset equal to the length of the local file. This saves both time and bandwidth.
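As a further sketch, -c combines well with other standard wget options. For example, on a flaky connection you can tell wget to retry indefinitely, and you can resume an interrupted batch download from the URL list used earlier:
$ wget -c -t 0 http://ftp.ussg.iu.edu/linux/ubuntu-releases/5.10/ubuntu-5.10-install-i386.iso
$ wget -c -i download-file-list.txt
Here -t 0 (--tries=0) means retry without limit. Note that resuming with -c only works when the server supports ranged (partial) requests; otherwise wget has to start the file over.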
For more information about wget, read its man page or the built-in help:
$ man wget
$ wget --help
See also:
- Man pages – wget(1), curl(1)