TechEcho

Curl vs. Wget

532 points by of · about 9 years ago

37 comments

JonathonW · about 9 years ago

For my usage:

* Wget's the interactive, end-user tool, and my go-to if I just need to download a file. For that purpose, its defaults are more sane, its command line usage is more straightforward, its documentation is better-organized, and it can continue incomplete downloads, which curl can't.

* Curl's the developer tool -- it's what I'd use if I were building a shell script that needed to download. The command line tool is more unix-y by default (outputs to stdout) and it's more flexible in terms of options. It's also present by default on more systems -- of note, OSX ships curl but *not* wget out of the box. Its backing library (libcurl) is also pretty nifty, but not really relevant to this comparison.

This doesn't really need to be an "emacs vs. vim" or "tabs vs. spaces"-type dichotomy: wget and curl do different things well and there's no reason why both shouldn't coexist in one's workflow.
falcolas · about 9 years ago

My favorite use of wget: mirroring web documentation to my local machine.

    wget -r -l5 -k -np -p https://docs.python.org/2/

Rewrites the links to point local where appropriate, and the ones which are not local remain links to the online documentation. Makes for a nice, seamless experience while browsing documentation.

I also prefer wget to `curl -O` for general file downloads, simply because wget will handle redirects by default, `curl -O` will not. Yes, I could remember yet another argument to curl... but why?

That said, I love curl (combined with `jq`) for playing with REST interfaces.
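The mirroring command above packs several flags; here is an annotated sketch of what each one does (GNU wget flags, plus the curl-with-jq pattern the comment mentions; the API endpoint is a made-up placeholder). A `run` helper prints the commands instead of executing them, so nothing is actually fetched:

```shell
# Dry-run helper: prints each command instead of running it.
# Swap the echo for "$@" to actually execute.
run() { echo "+ $*"; }

# -r   recurse into links          -l5  at most 5 levels deep
# -k   rewrite links for offline viewing
# -np  never ascend to the parent directory
# -p   also fetch page requisites (CSS, images)
run wget -r -l5 -k -np -p https://docs.python.org/2/

# The curl + jq combo for REST endpoints: -s hides the progress
# meter so only the response body would reach jq.
run curl -s https://api.example.com/items   # pipe into: jq '.items'
```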
geerlingguy · about 9 years ago

Also putting this out there: for nicer REST API interaction on the CLI, and a little more user-friendliness, you might also want to add HTTPie[1] to your toolbelt.

It's not going to replace curl or wget usage, but it is a nicer interface in certain circumstances.

[1] https://github.com/jkbrzt/httpie
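To get a feel for the difference, here is a sketch of the same POST in HTTPie and in curl (the endpoint, field, and header are invented for illustration). A `run` helper only prints the commands, so nothing is sent:

```shell
run() { echo "+ $*"; }  # print instead of execute

# HTTPie: key=value becomes a JSON field, Header:value a header.
run http POST https://api.example.com/users name=Alice X-Token:abc123

# The curl equivalent spells everything out by hand.
run curl -s -X POST https://api.example.com/users \
    -H 'Content-Type: application/json' \
    -H 'X-Token: abc123' \
    -d '{"name": "Alice"}'
```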
ubercow · about 9 years ago

Though only briefly mentioned at the bottom of the article, I'd like to give a huge shoutout to aria2. I use it all the time for quick torrent downloads, as it requires no daemon and just seeds until you C-c. It also does a damn good job at downloading a list of files, with multiple segments for each.
Rauchg · about 9 years ago

I instinctively go to `wget` when I need to, uhm, get the file onto my computer[1]. `curl -O` is a lot more effort :P

Other than that, curl is always better.

[1] Aliasing `wget` to ~`curl -O` might be a good idea :)
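A sketch of that footnote, with one caveat: plain `curl -O` neither follows redirects nor fails on HTTP errors, so adding `-fL` brings it closer to wget's defaults. The name `wcurl` is arbitrary:

```shell
# As an alias (for interactive shells):
#   alias wcurl='curl -fLO'
# As a function, which also works inside scripts:
#   -f  fail on HTTP errors    -L  follow redirects
#   -O  save under the remote file name
wcurl() { curl -fLO "$@"; }
```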
mooreds · about 9 years ago

"Wget can be typed in using only the left hand on a qwerty keyboard!"

I love both of these, but wish that curl were just like wget in that the default behavior was to download a file, as opposed to piping it to stdout. (Yes, aliases can help, I know.)
song · about 9 years ago

I use wget when I need to download things.

curl is for everything else (love it when it comes to debugging some API)... HTTPie is not bad for debugging either, but most of the time I forget to use it.
Veratyr · about 9 years ago

Since aria2 was only mentioned in passing, let me list some of its features:

- Supports splitting and parallelising downloads. Super handy if you're on a not-so-good internet connection.

- Supports BitTorrent.

- Can act as a server and has a really nice XML/JSON RPC interface over HTTP or WebSocket (I have a Chrome plugin that integrates with this pretty nicely).

They're not super important features, sure, but I stick with it because it's typically the fastest tool and I hate waiting.
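A few invocations illustrating those features (URLs and file names are placeholders; a `run` helper prints the commands rather than executing them):

```shell
run() { echo "+ $*"; }  # print instead of execute

# Split one download into 8 segments fetched in parallel:
run aria2c -x8 -s8 https://example.com/big.iso
# Download a list of URLs, up to 4 at a time:
run aria2c -i urls.txt -j4
# Seed a torrent until Ctrl-C, no daemon required:
run aria2c some.torrent
# Expose the RPC interface mentioned above:
run aria2c --enable-rpc --rpc-listen-all
```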
hmsimha · about 9 years ago

Curl gets another point for having better SNI support, as wget versions until relatively recently didn't support it.

This means you can't securely download content using relatively recent (but not the newest) versions of wget (such as any in the Ubuntu 12.04 repos) from a server which uses SNI, unless the domain you're requesting happens to be the default for the server.

As an example, I found the file https://redbot.org/static/style.css only accessible with SNI. Try `wget https://redbot.org/static/style.css` vs. `curl -O https://redbot.org/static/style.css` on Ubuntu 12.04. Domain names which point to S3 buckets (and likely other CDNs) will have similar issues.
josteink · about 9 years ago

For me, defaults matter... 99% of the time when I want to use wget or curl, I want to download a file so I can keep working on it from the filesystem.

wget does that without any parameters. Curl requires me to remember and provide parameters for this obvious use case.

So wget wins every time.
contingencies · about 9 years ago

If nobody's tried it: *axel*, mentioned in the report as possibly abandoned, has the awesome feature of splitting a download into parts and then establishing that many concurrent TCP connections. Very useful on networks that rate-limit individual TCP flows.
dallbee · about 9 years ago

We are forgetting our long-lost cousin, fetch. http://www.unix.com/man-page/FreeBSD/1/FETCH/
xd1936 · about 9 years ago

wget has the amazing flag `--page-requisites`, though, which downloads all of an HTML document's CSS and images that you might need to display it properly. Lifesaver.
et2o · about 9 years ago

After vi vs. emacs, this is truly the great debate of our generation.
DannyBee · about 9 years ago

Really interesting. Under curl he has:

"Much more developer activity. While this can be debated, I consider three metrics here: mailing list activity, source code commit frequency and release frequency. Anyone following these two projects can see that the curl project has a lot higher pace in all these areas, and it has been so for 10+ years. Compare on openhub"

Under wget he has:

"GNU. Wget is part of the GNU project and all copyrights are assigned to FSF. The curl project is entirely stand-alone and independent with no organization parenting at all with almost all copyrights owned by Daniel."

Daniel seems pretty wrong here. Curl does not require copyright assignment to him to contribute, and so, really, 389 people own the copyright to curl if the openhub data he points to is correct :)

Even if you give it the benefit of the doubt, it's super unlikely that he owns "almost all", unless there really is not a lot of outside development activity (so this is pretty incongruous with the above statement).

(I'm just about to email him with some comments about this; I just found it interesting.)
nowprovision · about 9 years ago

Unmentioned in the article: curl supports --resolve. This single feature helps us test all sorts of scenarios for HTTPS and hostname-based multiplexing where DNS isn't updated or consistent yet, e.g. transferring a site or bringing up cold standbys. Couldn't live without it (well, I could, if I wanted to edit /etc/hosts continuously).
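A sketch of the pattern (the domain and address are placeholders; a `run` helper prints instead of executing):

```shell
run() { echo "+ $*"; }  # print instead of execute

# Pin example.com:443 to a specific backend for this one request.
# TLS, SNI, and Host-based routing all behave as they will once
# DNS actually points there -- no /etc/hosts edits needed.
run curl --resolve example.com:443:203.0.113.10 https://example.com/
```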
jonalmeida · about 9 years ago

wget was the first one I learned how to use, by trying to recursively download a professor's course website for offline use, and then learning that they hosted the solutions to the assignments there as well...

I did well in that course; granted, it was an easy intro-to-programming one. ;)
hartator · about 9 years ago

> Wget requires no extra options to simply download a remote URL to a local file, while curl requires -o or -O.

I think this is oddly the major reason why wget is more popular. Saving 3 chars, plus not having to remember the specific curl flag, seems to matter more than we might think.
fosco · about 9 years ago

Curl scripts allow an open connection to view all new logs in a session.

Can wget do something similar? I don't know whether it can, but from my point of view, if it cannot, this is like comparing a Phillips-head screwdriver to a power tool with a 500-piece set.
notfoss · about 9 years ago

aria2 is much more reliable when downloading stuff, especially for links which involve redirections.

For example, here's a link to download 7zip for Windows from filehippo.com.

Results:

* curl doesn't download it at all.

    curl -O 'http://filehippo.com/download/file/bf0c7e39c244b0910cfcfaef2af45de88d8cae8cc0f55350074bf1664fbb698d/'

gives:

    curl: Remote file name has no length!

* wget manages to download the file, but with the wrong name.

    wget 'http://filehippo.com/download/file/bf0c7e39c244b0910cfcfaef2af45de88d8cae8cc0f55350074bf1664fbb698d/'

gives:

    2016-03-03 18:08:21 (75.9 KB/s) - 'index.html' saved [1371668/1371668]

* aria2 manages to download the file with the correct name with no additional switches.

    aria2c 'http://filehippo.com/download/file/bf0c7e39c244b0910cfcfaef2af45de88d8cae8cc0f55350074bf1664fbb698d/'

gives:

    03/03 18:08:45 [NOTICE] Download complete: /tmp/7z1514-x64.exe
awjr · about 9 years ago

Useful to know: if you use the Chrome dev tools, in the Network tab you can right-click on a request and "Copy as cURL".
dorfsmay · about 9 years ago

My usage pattern has been:

    - wget to download files (or even entire sites)
    - curl to debug everything http/https
haxpor · about 9 years ago

For a certain case like creating a Telegram bot, which has no interaction with a browser, do you think we can make use of curl (POST requests) to make PHP sessions work?

As there's no browser interaction in a Telegram bot, the script just receives responses back from the Telegram server. This might help to keep track of user state without the need for a db?
X-Istence · about 9 years ago

I use curl because it is generally installed. I prefer not to install wget, especially on customer machines, because that stops 90% of script kiddies. For some reason wget is the only tool they will attempt to use to download their sploit.
MoSal · about 9 years ago

I should probably write a "saldl vs. others" page someday.

> *Wget supports the Public Suffix List for handling cookie domains, curl does not.*

This is outdated info. (lib)curl can be built with libpsl support since 7.46.0.
sametmax · about 9 years ago

Nowadays I just use HTTPie. It's in Python, so it's easy to install on Windows, and it lets me work easily with requests and responses, inspect the content, add coloration, etc. Plus the syntax is much easier.
jrbapna · about 9 years ago

I like wget's option to continue a file download if it gets interrupted. I believe you can achieve the same thing in curl, but it's not as simple as just setting a flag (-c).
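For reference, both tools can resume, assuming the server supports range requests; curl's spelling is just less memorable (`-C -` means "work out the offset from the existing partial file"). The URL is a placeholder, and a `run` helper prints instead of executing:

```shell
run() { echo "+ $*"; }  # print instead of execute

run wget -c https://example.com/big.iso        # wget: just add -c
run curl -C - -O https://example.com/big.iso   # curl: -C - resumes
```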
dominhhai · about 9 years ago

>> Wget can be typed in using only the left hand on a qwerty keyboard!

Great!
arca_vorago · about 9 years ago

Wget is under GPLv3, so that's what I use more often. Sometimes I will use curl in certain cases, but yes, I will use a GPL product over a non-GPL product if given a choice.
cushychicken · about 9 years ago

The "only need a left hand" argument sways me toward wget.
StreamBright · about 9 years ago

There is no other industry where tools are debated as much as in IT. We literally waste tons of hours arguing over minor differences and nuances that really should not matter that much.
mistat · about 9 years ago

curl for checking HTTP headers, simply:

    curl -vskL http:1.2.3.4 -H "Host: example.com" > /dev/null
dustingetz · about 9 years ago

Who funds projects like this?
module17 · about 9 years ago

TL;DR: curl rocks.
ProceedsNow · about 9 years ago

Wget just werks.
mynewtb · about 9 years ago

You should all check out wpull!
keville · about 9 years ago

Everything 6 years old is new again: https://news.ycombinator.com/item?id=1241479