Resuming Downloads in Wget, cURL & aria2c



We all download stuff, and we want it to be perfect. No lagging, just downloading the fastest way possible, right?

I don’t download via browsers these days, because it is too slow for me (< 5 KB/s). Google Chrome downloads are the worst. So, I download via the command line now.

Not all downloads are easy. Sometimes they break, so we must resume them. Only some servers allow resuming downloads, and we also need client software that is able to resume them.
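Since resuming only works when the server allows it, it can be worth checking first. One quick way (a sketch — the helper name and URL below are mine, just for illustration) is to fetch the response headers and look for `Accept-Ranges`:

```shell
# Print the Accept-Ranges header from a server's response, if any.
# "Accept-Ranges: bytes" means the server supports resumable downloads.
supports_resume() {
  # -s = silent, -I = fetch headers only
  curl -sI "$1" | grep -i '^accept-ranges'
}

# Usage (hypothetical URL):
# supports_resume "http://example.com/file.zip"
```

If nothing is printed, the server probably does not support resuming and any interrupted download will have to start over.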

There is a lot of software that can be used for downloading on Linux and Windows. Most of it is GUI-based. But I like using commands.

In this post, I’m going to show how to resume downloads in Wget, cURL and aria2c. Since I use Ubuntu, you should read everything with Ubuntu Linux in mind.

cURL

If you use PHP, you can see that it ships with a cURL extension by default. cURL is one of the most popular tools for downloading. But it is complicated and not as easy as Wget or aria2c. Chrome’s Developer Tools (F12) can give you the cURL command for any request: open the Network tab, right-click the request and copy it as cURL.

Install it with:

sudo apt-get install curl

And here is the command to download:

curl -L -C - -o "myfile.zip" "http://example.com/file.zip"

Here is what the options mean:

| Option | What It Does |
|---|---|
| `-L` | Some URLs redirect to other locations. This option tells cURL to follow the redirect and obtain the file from the final location. |
| `-C -` | Tells cURL to resume the download; the dash ("-") that follows makes it automatically detect how much of the file has already been downloaded and continue from there. |
| `-o "myfile.zip"` | Sets the output location of the file being downloaded. |

The last quoted value is the URL of the file. This command can be used both to download for the first time and to resume the download later.
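If you don’t want to rerun the command by hand every time the connection drops, you can wrap it in a small retry loop. This is just a sketch — the function name, URL and filename are mine, not anything standard:

```shell
# Keep resuming with `-C -` until curl exits successfully.
# (resume_curl is a made-up helper name, not a real command.)
resume_curl() {
  url="$1"
  out="$2"
  until curl -L -C - -o "$out" "$url"; do
    echo "Download interrupted, retrying in 5 s..." >&2
    sleep 5
  done
}

# Usage (hypothetical URL):
# resume_curl "http://example.com/file.zip" "myfile.zip"
```

Newer versions of cURL can also retry on their own with the `--retry` option, but a loop like this works everywhere.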

aria2c

What a funny, complicated name, huh? This one is the easiest of the three, and it’s what I use to download now.

You can install it from the Ubuntu repositories (sudo apt-get install aria2) or by compiling it manually. For the latter, download the plain .tar.xz source archive, extract it, and run these commands in the extracted directory:

./configure && make && sudo make install

Now, here’s the command to download files:

aria2c -c -m 0 -o "myfile.zip" "http://example.com/file.zip"

Here is the explanation of each option used in the above command:

| Option | What It Does |
|---|---|
| `-c` | Tells aria2c to resume the download. |
| `-m 0` | Makes aria2c retry the download an unlimited number of times. Some servers tend to break the connection after some time, but this option makes aria2c retry automatically when that happens. |
| `-o "myfile.zip"` | Sets the output location of the file being downloaded. |

Like before, the last quoted value is the URL of the file.
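If you use aria2c a lot, the same options can go into its configuration file (by default `~/.aria2/aria2.conf`) using their long names, so every download resumes and retries automatically. A sketch:

```
# ~/.aria2/aria2.conf — long forms of the -c and -m 0 options:
continue=true
max-tries=0
# Optional: open up to 4 connections per server for faster downloads
max-connection-per-server=4
```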

Wget

Wget is another popular download tool. Its use is similar to that of the other tools, but one option is different.

Install it with:

sudo apt-get install wget

Here is a resumable download command:

wget -d -c --tries=0 --read-timeout=30 -O "myfile.zip" "http://example.com/file.zip"

And here is the explanation of the options used:

| Option | What It Does |
|---|---|
| `-d` | Turns on debug output, so you can see what Wget is doing (not required for the download itself). |
| `-c` | Tells Wget to resume the download. |
| `--tries=0` | Tells Wget to retry the connection an unlimited number of times when it is interrupted. |
| `--read-timeout=30` | If no data is received for 30 seconds, give up and establish a new connection. |
| `-O "myfile.zip"` | Sets the output location of the downloaded file. |

Of the above three, I would certainly recommend aria2c, because it’s easy to use and fast. Have fun downloading 🙂