
Verified curl

146 points by TangerineDream about 1 year ago

10 comments

kpcyrd about 1 year ago
I'm not a fan of the "Reproducible tarballs" section, because it's explicitly about pre-processing the source code with autotools, instead of distributing a pure, unaltered git snapshot (which `git archive` can already generate in a deterministic way).

The following section then mentions signing the pre-processed source code, which I think is the wrong approach. It makes for a difficult situation, because some people strongly encourage signed source code, yet I think autotools is part of the build process and should run on the build server (and be double-checked by reproducible builds). If people pre-process the .orig.tar.xz they upload to Debian, that pre-processing won't be covered by reproducible builds, because it is an undocumented step.

The patch for "reproducible tarballs" is quite involved[0] and has rookie mistakes like "pin a specific container image using `@sha256:...` syntax, but then invoke `apt-get update` and `apt-get install` to install whatever Debian ships at that time".

[0]: https://github.com/curl/curl/pull/13250/files
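To make the `git archive` point concrete, a plain source snapshot of a tagged release can be generated and re-checked roughly like this (the tag and file names are only examples, and identical bytes assume the same git version on both ends):

    # deterministic snapshot of a tag, straight from the repository
    git archive --format=tar --prefix=curl-8.7.1/ curl-8_7_1 \
        | gzip -n > curl-8.7.1.tar.gz

    # anyone with the same tag and git version can re-derive and compare
    sha256sum curl-8.7.1.tar.gz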
Karellen about 1 year ago
> We do not provide checksums for the tarballs simply because providing checksums next to the downloads adds almost no extra verification. If someone can tamper with the tarballs, they can probably update the webpage which a fake checksum as well.

In the past, I have occasionally downloaded a tarball from one mirror, and verified against a checksum from a different mirror (or from the official website). Back when release announcements were primarily made on mailing lists, using a mailing list archive to get a copy of the "real" checksum was also a possibility.

I definitely remember being advised to get the checksum from a different source than the tarball, a number of different times.

> Our policy says that if a contrition is good:

* contribution
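The cross-mirror check described above amounts to something like this (the URLs are placeholders, not real mirrors):

    # tarball from one mirror, checksum from a different source
    curl -LO https://mirror-a.example.org/curl-8.7.1.tar.gz
    curl -LO https://mirror-b.example.org/curl-8.7.1.tar.gz.sha256

    # fails unless the independently obtained checksum matches
    sha256sum -c curl-8.7.1.tar.gz.sha256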
Lockal about 1 year ago
I don't understand his attitude towards "anonymous maintainers". Right now ALL contributions to curl are pseudonymous, including his own. There is just no such organization as "Curl". If you want to see non-anonymous contributors, go to Google/Intel/etc.; they ask for IDs when they hire employees.

> A (to me) surprisingly large amount of contributions are done by people who do not state a full real name

Again, a strange attitude, given that he personally had legal issues with the US in the past, the reasons for which were never disclosed[1].

Typical good developers are not Rambo. When law enforcement comes to them and forces them at gunpoint to make a commit or add a new maintainer, don't expect active resistance. Minor reminder: curl is not just some HTTP library; they maintain their own CA list[2]. They don't need any intricate hidden lines for a backdoor; the CA list is a backdoor on its own.

[1]: https://daniel.haxx.se/us-visa.html
[2]: https://curl.se/docs/caextract.html
lrvick about 1 year ago
In Stagex we already produce deterministic, full-source-bootstrapped, OCI-native builds of curl.

release: https://hub.docker.com/layers/stagex/curl/latest/images/sha256-dcb686988d8fc52abe879e4a6fb292b2f96103a2f9eedfb83509e3f965a85498?context=explore

source: https://codeberg.org/stagex/stagex/src/branch/main/packages/curl/Containerfile

signatures: https://codeberg.org/stagex/stagex/src/branch/main/signatures/stagex/curl@sha256=dcb686988d8fc52abe879e4a6fb292b2f96103a2f9eedfb83509e3f965a85498

Every package is bootstrapped all the way up from a heavily reproduced 256-byte assembly seed (Stage0/live-bootstrap) and built by two or more maintainers with confirmed matching hashes, and with signatures from well-known keys.

100% of commits in our repos are also signed, and every PR merge also comes with a signed merge commit by the reviewer.

Information on how to verify our keys is found in the maintainers file: https://codeberg.org/stagex/stagex/src/branch/main/MAINTAINERS

If the curl team wants a similar level of supply-chain security for their own official binaries or Dockerfile, we would suggest cloning our Containerfile and hash-locking all dependencies to the latest stagex release (please, by all means, reproduce, verify, and sign that too!).

This should make it easy for curl maintainers to build their own release binaries and get hashes identical to the ones we build and sign.

Stagex could also be used to produce source tarballs with generated files from similarly deterministic/multi-signed versions of autotools etc.

I am not convinced there is a good case for having any auto-generated files in the source archives, though. Force distros to bring their own autotools, etc., imo.
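As a concrete illustration of the hash-locking suggestion, pulling the published image by digest rather than by a floating tag pins the exact bytes (the digest below is copied from the release link above; the rebuild-and-compare step is only sketched):

    # pull the release image by digest, not by a mutable tag
    docker pull stagex/curl@sha256:dcb686988d8fc52abe879e4a6fb292b2f96103a2f9eedfb83509e3f965a85498

    # list known digests for comparison against the signed values
    docker images --digests stagex/curl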
nickelpro about 1 year ago
All of this is nice, but none of it stops the style of hack that happened to xz, besides the fact that it's very unlikely Dan is going to be bullied into handing over maintainership to someone else.

Every other element on the list can be attacked if the maintainer themselves is the malicious party carrying out the attack and it is being performed with the level of sophistication in the xz attack.

So ya, I'm not saying it's security theater in all contexts, but if the context is the attack vector used in the xz attack, it's security theater.
orthoxerox about 1 year ago
> A (to me) surprisingly large amount of contributions are done by people who do not state a full real name.

Why would someone state their full real name on the internet to contribute to curl? If I want to boost my CV, I can just write "curl, Linux kernel, GCC contributor" and provide the link to my GitHub profile upon request. Yes, someone in HR will (gasp) learn that orthoxerox is actually called Boris Kozlov in meatspace, but there's no need to broadcast this information.
WirelessGigabit about 1 year ago
I went as far as building code in PRs and reusing those blobs when the code gets merged into main. I find it super weird that in many projects the code gets rebuilt and packaged on merge to main, and again when released, all while a lot of the underlying processes aren't locked.

    apt install xyz

can give different results each time.

A Python requirements.txt can as well (which is why we use Poetry).
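A minimal sketch of what locking those two examples down could look like (package names, versions, and the hash are made up):

    # apt: pin an exact package version instead of whatever is current
    apt-get install -y xyz=1.2.3-1

    # pip: lock exact versions plus hashes, then refuse anything else
    #   requirements.txt:
    #     requests==2.31.0 --hash=sha256:<hash of the wheel>
    pip install --require-hashes -r requirements.txt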
pornel about 1 year ago
C build systems are so cursed. Autotools is obviously horrible, but projects like curl cling to it, because somehow every other C build system sucks too.

It's frankly amazing how deeply fragmented and backwards-looking C is that this continues to be a problem. In the last 30 years, a countless number of C build systems have been created, and yet an ugly pile of obscure macros outlives them all.
hsbauauvhabzb about 1 year ago
Great, one package down, 8000 more dependencies to go
Canada about 1 year ago
A compression library has a better excuse than most to have weird binary blobs in the test suite.