
The curl-wget Venn diagram

285 points by TangerineDream over 1 year ago

31 comments

vesinisa over 1 year ago
I would also add at least "sane default options", "continues downloads" and "retries on error" to the Wget column. I recently had to write a script that downloads a very large file over a somewhat unreliable connection. The common wisdom among the engineers is that you need to use Wget for this job. I tried using curl, but out of the box it could not resume or retry the download. I would have had to study the manual and specify multiple options with arguments for behaviour that really sounds like something that should just work out of the box.

Wget needed one option to enable resuming in all conditions, even after a crash: --continue

Wget's introduction in the manual page also states: "Wget has been designed for robustness over slow or unstable network connections; if a download fails due to a network problem, it will keep retrying until the whole file has been retrieved."

I was sold. Even if I by some miracle managed to get all the curl options for reliable performance over a poor connection right, Wget seems to have those on by default, and the sane defaults make me believe it will also have the expected correct behaviour enabled even in error scenarios I did not think to test myself. And if the HTTP protocol ever receives updates, newer versions of Wget will support those by default too, while curl will require new switches to enable the enhanced behaviour, something I cannot add after the product has shipped.

To me it often seems like curl is a good and extremely versatile low-level tool, and the CLI reflects this. But for my everyday work I prefer Wget, as it seems to work much better out of the box. And its manual page is much faster to navigate, probably in part because it simply does not support all the obscure protocols called out on this page.
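For the record, curl can be coaxed into similar behaviour; it just is not the default. A minimal sketch, with a placeholder URL:

    # wget: resume across crashes, retry indefinitely
    wget --continue --tries=0 https://example.com/big-file.iso

    # curl, roughly equivalent (assuming a reasonably recent build):
    #   -C -                resume from where a previous transfer left off
    #   --retry 20          retry transient failures up to 20 times
    #   --retry-all-errors  also retry on "hard" errors (curl 7.71+)
    curl -C - --retry 20 --retry-all-errors -O https://example.com/big-file.iso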
red_admiral over 1 year ago
For many of us, I bet the key distinction is "the one that writes to stdout by default" vs "the one that makes a file by default".
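A quick illustration of those flipped defaults (URL is a placeholder):

    curl https://example.com/file.txt       # body goes to stdout
    wget https://example.com/file.txt       # writes ./file.txt
    curl -O https://example.com/file.txt    # curl: write a file instead
    wget -qO- https://example.com/file.txt  # wget: write to stdout instead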
enriquto over 1 year ago
For me the killer feature of wget is that by default it downloads a file with a name derived from the URL.

You do:

    wget url://to/file.htm

and a file named "file.htm" appears in your cwd.

Using curl, you would have to do

    curl url://to/file.htm > file.htm

or some other, less ergonomic, incantation.
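For the record, the closest curl spelling is only one flag away: -O (--remote-name) names the file after the URL, and adding -J (--remote-header-name) honours a server-supplied Content-Disposition filename:

    curl -O https://example.com/file.htm
    curl -OJ https://example.com/download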
ducktective over 1 year ago
Daniel Stenberg is among that rare breed of developers who put their heart and soul into their creation, a fading trait in the modern world of big tech, where shadowy developers seem to be replaceable cogs in a money-making machine.

It's as if he treats curl as his mark on the world of IT.
tyingq over 1 year ago
Seems maybe dated. For example, it excludes both of these from wget in the diagram:

> HTTP PUT

    wget --method=PUT --body-data=<STRING>

> proxies ... HTTPS

    wget --use-proxy=on --https_proxy=https://example.com

curl consistently has more options and flexibility, but there are several things on the right side of the Venn diagram where wget does have some capability.
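The curl counterparts, as a sketch (host names are placeholders): -T/--upload-file issues an HTTP PUT, and -x/--proxy proxies HTTPS URLs as well:

    # PUT the contents of body.txt
    curl -T body.txt https://example.com/resource

    # fetch an HTTPS URL through a proxy
    curl -x http://proxy.example.com:3128 https://example.com/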
rob74 over 1 year ago
Ok, wow, I didn't know that curl supported so many protocols - but the fact remains that that small intersection area is probably what > 90% of curl/Wget users are using the tools for. So, from a developer's perspective, the overlap is not that big, but from a user's perspective it *might* appear much bigger...
kriro over 1 year ago
The best part of the post for me is:

"""I have contributed code to wget. Several wget maintainers have contributed to curl. We are all friends."""
nicce over 1 year ago
Mandatory mention of the comparison made by Daniel Stenberg himself:

https://daniel.haxx.se/docs/curl-vs-wget.html
coldtea over 1 year ago
In the olden times we used wget when we wanted to mirror a website. It is a specialized tool.

curl is a general-purpose request library with a CLI frontend (also used embedded in other programs, or as a standard library API in PHP etc.).
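The classic mirroring incantation, as a sketch (site URL is a placeholder):

    # recursive mirror with timestamping, links rewritten for local
    # browsing, plus the images/CSS needed to view pages offline
    wget --mirror --convert-links --page-requisites --no-parent https://example.com/docs/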
jaimehrubiks over 1 year ago
I guess the most common usage is the overlap between the two. That's why I'd love to see a Venn diagram of where (OSes and Docker images) each is installed by default!
johnchristopher over 1 year ago
What are happy eyeballs in the curl circle?
jiehong over 1 year ago
Looks like wget 2 introduces an equivalent to libcurl: libwget [0][1].

[0]: https://gitlab.com/gnuwget/wget2

[1]: https://en.wikipedia.org/wiki/Wget#Wget2
jiofj over 1 year ago
wget's "downloads recursively" is worth half the features of curl.
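For anyone who has not used it, a small sketch of that recursive mode (URL and filter are placeholders):

    # fetch up to 3 levels deep, never ascending above the start URL,
    # and keep only PDFs
    wget -r -l 3 --no-parent -A '*.pdf' https://example.com/papers/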
KaiserPro over 1 year ago
I've never seen them as competitors!

wget is my go-to if I need to download a file now, with the minimum of fuss.

curl is used when I need to do something fancy with a URL to make it work, or when I'm fiddling with params to make an API work/debug it.
tm11zz over 1 year ago
curl is the ffmpeg of url fetching
lurtbancaster over 1 year ago
A couple more things wget can do that curl can't.

1. wget can resolve onion links. curl can't (yet). You'll get a

    curl: (6) Not resolving .onion address (RFC 7686)

2. curl has problems parsing unicode characters:

    curl -s -A "Mozilla/5.0 (Windows NT 10.0; rv:102.0) Gecko/20100101 Firefox/102.0" https://old.reddit.com/r/GonewildAudible/comments/wznkop/f4m_mi_coño_esta_mojada_summer22tomboy/.json

will give you a

    {"message": "Bad Request", "error": 400}

wget, on the other hand, automatically converts the ñ to percent-encoded UTF-8 (%C3%B1) and resolves the link perfectly.

I've searched the curl manpage and couldn't find a way to solve this. Please help.

I'm having to use `xh --curl` [1] to "fix" the links before I pass them to curl.

[1] https://github.com/ducaale/xh
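One workaround, sketched: percent-encode the URL before handing it to curl, for example with a Python one-liner (the URL here is a placeholder for the raw link):

    raw='https://example.com/r/test/mi_coño_test/.json'
    encoded=$(python3 -c 'import sys, urllib.parse as u; print(u.quote(sys.argv[1], safe=":/?&=%"))' "$raw")
    curl -s "$encoded"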
safety1st over 1 year ago
Curl can fetch over imap/imaps? Can I use it to download and back up my entire mailbox?
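It can; a hedged sketch of what that looks like (server, credentials and UID are placeholders; a real backup script would enumerate folders and UIDs):

    # list mailboxes
    curl -u alice:secret "imaps://imap.example.com/"

    # fetch the message with UID 1 from the inbox
    curl -u alice:secret "imaps://imap.example.com/INBOX;UID=1" -o msg-1.eml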
Gud over 1 year ago
For the FreeBSD users out there, ‘fetch’ is also available.

I don't know what the advantages/disadvantages are, but it comes with the default install. It's usually what I use.
bandergirl over 1 year ago
curl is a connection tool while wget is an app. At a basic level they do the same thing, but they excel in different areas.

This diagram is clearly and unapologetically biased towards curl. It feels strange that the author of curl doesn't know what wget actually offers.
alexchamberlain over 1 year ago
I recently found [axel], which is a very impressive wget-like tool for larger files.

[axel]: https://github.com/axel-download-accelerator/axel
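A typical invocation, for the curious (URL is a placeholder); -n sets the number of parallel connections:

    axel -n 8 -o big-file.iso https://example.com/big-file.iso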
bravetraveler over 1 year ago
On the cURL side: ridiculous manual.

I regularly forget the order of the values for --resolve; try searching for that word and figuring it out quickly.

I've been relegated to grepping a flippin' manual.
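For the record, the order is host:port:address, so pinning example.com to a local server looks like:

    curl --resolve example.com:443:127.0.0.1 https://example.com/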
Aissen over 1 year ago
Another thing I forget that wasn't supported in wget (but worked in curl) last I checked: IPv6 link-local address scopes (interface names on Linux).
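A sketch of the curl form (address and interface are placeholders): the zone ID follows a percent-encoded % sign, and -g turns off URL globbing so the brackets survive:

    curl -g "http://[fe80::1%25eth0]/"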
quickthrower2 over 1 year ago
I find wget is more likely to be on a given system than curl by default, so I usually reach for that first. But I am squarely in the middle of the Venn.
azatom over 1 year ago
Don't forget the weekly security fix on the right side ;)
ketanmaheshwari over 1 year ago
I have never seen an example of curl working with SFTP. Does anyone know of one, or have you used curl over SFTP?
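A hedged example (host, user and paths are placeholders): key-based auth goes through --key, and -u with an empty password defers to the key:

    # download a remote file, keeping its name
    curl -u alice: --key ~/.ssh/id_ed25519 -O sftp://example.com/home/alice/data.tgz

    # upload a local file
    curl -u alice: --key ~/.ssh/id_ed25519 -T local.txt sftp://example.com/home/alice/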
k255 over 1 year ago
WARC support is something wget-specific that is worth mentioning.
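A minimal sketch (URL is a placeholder): wget records the whole crawl into a WARC archive alongside the normal download:

    wget --warc-file=crawl --mirror https://example.com/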
wizofaus over 1 year ago
Can anyone explain "happy eyeballs"? I did find one page about it, but it wasn't 100% clear what the use case for it being an option was, or where on earth the name came from...
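Short version: Happy Eyeballs (RFC 8305, reportedly named for keeping the user's "eyeballs" happy by avoiding long connect stalls) races IPv6 and IPv4 connection attempts in parallel and uses whichever succeeds first. curl exposes the head start given to IPv6 as a knob, e.g.:

    # wait only 100 ms before also trying the IPv4 address
    curl --happy-eyeballs-timeout-ms 100 https://example.com/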
specialist over 1 year ago
Neat. Love it.

Is there a feature-matrix-to-Venn-diagram converter?

(Deep down) on my to-do list is comparing Ansible, Puppet, Chef, Docker, etc.

Which ultimately means some kind of feature matrix, right?

With a converter, we'd get Venns for free.
geocrasher over 1 year ago
To me the real takeaway here isn't related to wget or curl. It's related to using the right tool for the job, whatever that is.
jpeeler over 1 year ago
Does the webpage-parsing functionality of wget only come into play when doing something like an entire site backup?
jgalt212 over 1 year ago
For the intersection area, I see no reason to use curl or wget over requests/urllib, assuming one is inside a script.