# wcurl vs Wget

The main differences.
File issues or pull-requests if you find problems or have improvements to this comparison.
## What both commands do/are
- command line tools that download URLs over FTP and HTTP(S)
- work without user interaction
- fully open source and free software
- portable and run on many operating systems
- follow redirects automatically
- get the remote file's time and set it locally
- retry on transient errors
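For a plain single-URL download, the two tools are largely interchangeable. A minimal sketch (the URL is a placeholder):

```shell
# Download a file, keeping the remote filename, with either tool.
# Both follow redirects and retry on transient errors out of the box
# (wget via its built-in retry logic, wcurl by enabling curl's --retry).
wget https://example.com/file.tar.gz
wcurl https://example.com/file.tar.gz
```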
## How they differ
### wcurl

- a shell script that invokes curl
- downloads URLs in parallel
- curl licensed (a permissive MIT-style license)
- supports all the URL schemes curl does, many more than Wget
- connects using happy eyeballs (tries IPv4 and IPv6 in parallel)
- a little complicated to run on Windows
- supports HTTP/1 and HTTP/2 by default, can do HTTP/3 with some determination
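Giving wcurl several URLs on one command line makes it fetch them in parallel, and extra flags can be passed through to the underlying curl. A sketch (URLs are placeholders; `--curl-options` as documented in the wcurl manual):

```shell
# Multiple URLs on one command line are downloaded in parallel.
wcurl https://example.com/a.iso https://example.com/b.iso

# Pass arbitrary flags down to the curl invocation, e.g. to
# attempt HTTP/3 (requires a curl built with HTTP/3 support).
wcurl --curl-options="--http3" https://example.com/a.iso
```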
### Wget / Wget2

- a stand-alone executable
- downloads URLs serially (Wget2 uses parallel transfers)
- supports only HTTP/1 (Wget2 also does HTTP/2; neither does HTTP/3)
- Recursive!: Wget's major strength compared to wcurl is its ability to download recursively: it can fetch everything that is referred to from a remote resource, be it an HTML page or an FTP directory listing
- GPL: Wget is licensed under GPLv3
- enables cookies by default
- works on Windows
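Recursive retrieval is where Wget shines and wcurl has no equivalent. A sketch using standard Wget flags (the URLs are placeholders):

```shell
# Fetch everything linked from a page, two levels deep,
# without ascending to the parent directory.
wget --recursive --level=2 --no-parent https://example.com/docs/

# Or mirror a whole site, rewriting links and pulling in page
# requisites (images, CSS) so it can be browsed locally.
wget --mirror --convert-links --page-requisites https://example.com/
```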
## When to use which
Primarily: use the one that gets the job done for you.