
Why doesn't Linux apt use HTTPS?

44 points, by thecodeboy, almost 6 years ago

12 comments

dalf, almost 6 years ago
Previous post: https://news.ycombinator.com/item?id=18958679
DCKing, almost 6 years ago
> Furthermore, even over an encrypted connection it is not difficult to figure out which files you are downloading based on the size of the transfer [2]. HTTPS would therefore only be useful for downloading from a server that also offers other packages of similar or identical size.

This nonsensical argument again.

Eavesdropping on HTTP: *inspect the request body and see which package and version is requested*. That's it.

Eavesdropping on HTTPS: 1) build up a database of package sizes per version; 2) reassemble the HTTPS traffic to figure out what the requests are; 3) account for randomized padding lengths and packages of similar sizes (what if a minor security fix results in the same package size?); 4) perform a lookup of the package version in your sophisticated database.

It's not even the same ballpark of complexity. Sure, dedicated, targeted, semi-sophisticated attackers can still eavesdrop on your HTTPS connections, but HTTPS sure as heck protects against casual snoopers. *Which* do you really think is more relevant in the real world? And furthermore, what kind of attacker achieves the level of sophistication needed for such a lookup mechanism, yet doesn't have the sophistication to screw you over some other way? There is zero understanding of economics or real-world attacker motivations in this argument.

It boggles my mind that there are people so stubborn - or who think they're so clever - that they would rather set up a dedicated website with a "well, actually" argument based purely on technology, instead of thinking critically about it and working towards giving people sane defaults.
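To make the asymmetry concrete, here is a minimal shell sketch of what steps 1 and 4 alone would involve: building a size-to-package lookup from a repository's Packages index and matching an observed transfer size against it. The mirror URL and the observed size are made-up placeholders, and real packages frequently collide on size.

# Step 1: build a size -> package/version table from a (hypothetical) mirror's Packages index.
curl -s http://mirror.example.org/ubuntu/dists/bionic/main/binary-amd64/Packages.gz \
  | gunzip \
  | awk '/^Package:/{p=$2} /^Version:/{v=$2} /^Size:/{print $2 "\t" p "\t" v}' > size-db.tsv

# Step 4: look up an observed (approximate) transfer size; expect many candidates, or none.
observed=1048576
awk -F'\t' -v s="$observed" '$1 == s' size-db.tsv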
lol768, almost 6 years ago
Let's ignore the issue of integrity and look at confidentiality:

• Browsers will reuse the same TCP connection when downloading multiple resources. Does apt not do this? This seems like it would make inferring package versions and names difficult.

• Is it impractical to standardize on a fixed block size that works for most packages, and just add noise as required to 'top up' the size of the payload to match the same size as all the others?

I found these articles interesting:

• https://tools.ietf.org/html/draft-pironti-tls-length-hiding-01

• https://hal.inria.fr/hal-00732449/document

Also, is there an actual PoC for any of these size-related side channel attacks? I'd take it all a lot more seriously if there was one.
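For what it's worth, the 'top up' idea in the second bullet can be sketched in a few lines of shell: pad every payload to the next multiple of a fixed block size before serving it. The block size and filenames below are illustrative, not anything apt actually does.

block=1048576                              # pad every artifact to a multiple of 1 MiB
size=$(stat -c %s package.deb)             # actual payload size
pad=$(( (block - size % block) % block ))  # bytes needed to reach the next block boundary
cp package.deb padded.deb
head -c "$pad" /dev/zero >> padded.deb     # all padded payloads now fall into a small set of size classes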
crooked-v, almost 6 years ago
Yet again, the perfect gets treated as the enemy of the good.
lucideer, almost 6 years ago
I'm not sure when this site was last updated, but I'm guessing this is being posted here due to the edits added to the top of the post this year, most notably the link to CVE-2019-3462 [0], which was reported in January.

The last time I read whydoesaptnotusehttps.com, the tone of the article seemed disappointingly in favour of the status quo. The intro to the article now seems much more open to change.

(This site isn't on the Wayback Machine, so I'm going on memory - not sure how significantly the article has actually changed.)

[0] https://lists.debian.org/debian-security-announce/2019/msg00010.html
olliej, almost 6 years ago
No one serious uses HTTPS as the authentication for raw packages - Google, Apple, and Microsoft all sign updates/software/whatever with separate keys.

They also aggressively pin those connections.

However, because they're serving over HTTPS, a MITM can only DoS the update system: they can't change the update or dependency lists, they can't insert malicious content into those responses, and they can't add cookies to the requests and responses.

Privacy can also be fixed if you simply pull multiple resources over the same connection (which is also faster).

Just use HTTPS.
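As a rough illustration of that separation of concerns, the signature check apt relies on is independent of the transport; something equivalent can be done by hand against a repository's signed Release file. The mirror URL below is a placeholder, and the keyring path is the usual Ubuntu location.

# Fetch the index and its detached signature; the transport could just as well be plain HTTP.
curl -sO https://mirror.example.org/ubuntu/dists/bionic/Release
curl -sO https://mirror.example.org/ubuntu/dists/bionic/Release.gpg

# Verify the index against the distribution's archive keyring; individual packages are then
# checked against the hashes listed in the verified Release file.
gpg --no-default-keyring --keyring /usr/share/keyrings/ubuntu-archive-keyring.gpg \
    --verify Release.gpg Release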
digitalsushi, almost 6 years ago
In my corporate environment, we are prohibited from using HTTP, but we do not require our certificates to be up to date. Since the proxy does not allow Internet access except to a whitelist of hosts, we have to do something like this in order to take an Ubuntu ISO from a vendor and convert it into an OS template that my company can use:

echo "Acquire::http::Proxy \"http://personal-cntlm-proxy:3128\";" > /etc/apt/apt.conf

apt-get install -y apt-transport-https

echo "deb [trusted=yes] https://someserver/somedir bionic main universe multiverse" > /etc/apt/sources.list

echo "deb [trusted=yes] https://someserver/somedir bionic-updates main universe multiverse" >> /etc/apt/sources.list

echo "deb [trusted=yes] https://someserver/somedir bionic-security main universe multiverse" >> /etc/apt/sources.list

echo "Acquire::https::Verify-Peer \"false\";" > /etc/apt/apt.conf.d/80ssl-exceptions

echo "Acquire::https::Verify-Host \"false\";" >> /etc/apt/apt.conf.d/80ssl-exceptions

apt-get -y install ca-certificates  # and now the server is finally trusted

echo "deb https://someserver/somedir bionic main universe multiverse" > /etc/apt/sources.list

echo "deb https://someserver/somedir bionic-updates main universe multiverse" >> /etc/apt/sources.list

echo "deb https://someserver/somedir bionic-security main universe multiverse" >> /etc/apt/sources.list

rm /etc/apt/apt.conf.d/80ssl-exceptions

Probably not even anywhere near the prescribed way to do this, but everything in corporate America has a few extra dance steps.
dvh, almost 6 years ago
At my previous job I could not install the package libelf1 through apt because "f1" was banned by the company firewall.
LoSboccacc, almost 6 years ago
Because packages are signed, which gives the same level of trust against tampering as an HTTPS certificate (or even better - think how weak a guarantee a Let's Encrypt certificate is), but allows hosting to be delegated to infinitely many untrusted mirrors.
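For illustration, that delegation is visible in a sources.list entry: the mirror URL can point anywhere, while trust is pinned to a specific keyring. Both values below are placeholders.

# /etc/apt/sources.list.d/mirror.list - any untrusted mirror may serve the bytes;
# apt only accepts indexes signed by a key in the named keyring.
deb [signed-by=/usr/share/keyrings/ubuntu-archive-keyring.gpg] http://mirror.example.org/ubuntu bionic main universe multiverse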
edf13, almost 6 years ago
Haven’t we been through this a few times before?
overcast, almost 6 years ago
whydoesthisurlexist.com
0x8BADF00D, almost 6 years ago
The only way to be 99% sure your packages aren't tampered with is to use a source-based package management tool. Even then there's no guarantee, as you are placing your trust in the package maintainers and contributors of that particular package.
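A sketch of what that source-based trust chain typically looks like (URLs and filenames are placeholders) - the signature check just moves the trust to whoever holds the upstream signing key:

curl -sO https://example.org/releases/pkg-1.0.tar.gz
curl -sO https://example.org/releases/pkg-1.0.tar.gz.asc
gpg --verify pkg-1.0.tar.gz.asc pkg-1.0.tar.gz   # trust now rests on the maintainer's key
tar xf pkg-1.0.tar.gz && cd pkg-1.0
./configure && make && sudo make install         # build from the source you can actually audit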