Downloading a list of URLs automatically

I recently needed to download a bunch of files from Amazon S3, but I didn't have direct access to the bucket — I only had a list of URLs.

There were too many to fetch one by one, so I wanted to fetch them automatically. Here are a couple ways I found to do that.

Using curl #

Curl comes preinstalled on every Mac and on most Linux distros, so it was my first choice for this task. Turns out it's pretty easy.

Create a new file called files.txt and paste the URLs one per line. Then run the following command:

xargs -n 1 curl -O < files.txt
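For reference, files.txt is just a plain text file with one URL per line, something like this (placeholder URLs, not the real ones):

https://example-bucket.s3.amazonaws.com/report-01.pdf
https://example-bucket.s3.amazonaws.com/report-02.pdf
https://example-bucket.s3.amazonaws.com/report-03.pdf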

Curl will download each file into the current directory, saving it under the same name it has on the server (that's what the -O flag does).
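If the downloads are slow, you can let xargs run several curl processes in parallel, and if any of the URLs redirect, adding -L tells curl to follow the redirect. A variation that should work on both macOS and Linux (the parallelism of 4 is an arbitrary choice):

xargs -P 4 -n 1 curl -L -O < files.txt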

Using wget #

If curl isn't available for some reason, or you'd simply rather use wget, you can do the same thing with it.

Create a new file called files.txt and paste the URLs one per line. Then run the following command:

wget -i files.txt

Wget will download every file in the list into the current directory.
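Two wget flags that can be handy for this kind of batch job: -c resumes partially downloaded files if the run gets interrupted, and -P saves everything into a directory of your choosing instead of the current one (the downloads directory here is just an example):

wget -c -P downloads -i files.txt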

Tip for macOS users: If you want to use wget on macOS, you can install it via Homebrew using brew install wget.