TechEcho
A tech news platform built with Next.js, providing global tech news and discussions.


Say goodbye to resource-caching across sites and domains

74 points by stefanjudis, over 4 years ago

18 comments

titzer, over 4 years ago

Unfortunately security and efficiency are at odds here.

We faced a similar dilemma in designing the caching for compiled Wasm modules in the V8 engine. In theory, it would be great to just have one cached copy of the compiled machine code of a wasm module from the wild web. But in addition to the information leak from cold/warm caches, there is the possibility that one site could exploit a bug in the engine to inject vulnerable code (or even potentially malicious miscompiled native code) into one or more other sites. Because of this, wasm module caching is tied to the HTTP cache in Chrome in the same way, so it suffers the double-key problem.
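The double-key problem titzer describes can be sketched as a cache keyed by (top-frame site, resource URL) instead of by URL alone. The sketch below is illustrative only; the class and key names are not Chrome's actual internals.

```python
# Sketch of single-keyed vs. double-keyed (partitioned) HTTP caching.
# Names and structure are illustrative, not Chrome's real implementation.

class SingleKeyCache:
    """Classic shared cache: one entry per URL, shared across all sites."""
    def __init__(self):
        self._store = {}

    def fetch(self, top_frame_site, url, network):
        if url not in self._store:
            self._store[url] = network(url)  # cache miss: hit the network
        return self._store[url]


class DoubleKeyCache:
    """Partitioned cache: keyed by (top-frame site, URL), so site A's
    entries are invisible to site B, closing the cross-site side channel."""
    def __init__(self):
        self._store = {}

    def fetch(self, top_frame_site, url, network):
        key = (top_frame_site, url)
        if key not in self._store:
            self._store[key] = network(url)
        return self._store[key]


# Demo: count network fetches when two sites embed the same CDN script.
fetch_count = 0

def network(url):
    global fetch_count
    fetch_count += 1
    return f"<contents of {url}>"

shared = SingleKeyCache()
shared.fetch("a.example", "https://cdn.example/jquery.js", network)
shared.fetch("b.example", "https://cdn.example/jquery.js", network)
shared_fetches = fetch_count       # 1: b.example reuses a.example's entry

fetch_count = 0
partitioned = DoubleKeyCache()
partitioned.fetch("a.example", "https://cdn.example/jquery.js", network)
partitioned.fetch("b.example", "https://cdn.example/jquery.js", network)
partitioned_fetches = fetch_count  # 2: each partition fetches its own copy
```

The second fetch in the partitioned case is exactly the extra network traffic the change trades away for privacy.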
jrochkind1, over 4 years ago

Can anyone find any data on how often cache hits happened for shared resources from CDNs anyway? How useful was this actually? I'm not confident it was a non-trivial portion of bytes fetched by a typical session. But maybe it was. Has anyone found a way to measure in any way?
user5994461, over 4 years ago

My guess is the impact of cross-site caching is negligible. We're losing nothing here.

1) The cache hit rate must be extremely low because of the different versions/variants/CDNs for each library. (Have you seen how many jQuery builds there are?)

2) It's irrelevant outside of the few most popular libraries on the planet, maybe jQuery/Bootstrap/Google Fonts.

3) Content is cached once on page load, and the savings happen over the next tens or hundreds of pages you visit. That's where the gain of caching is (10x - 100x). Saving one load when changing site is negligible in the grand scheme of things.
trampi, over 4 years ago

For anyone asking what this means in numbers:

> The overall cache miss rate increases by about 3.6%, changes to the FCP (First Contentful Paint) are modest (~0.3%), and the overall fraction of bytes loaded from the network increases by around 4%. You can learn more about the impact on performance in the HTTP cache partitioning explainer. [0]

[0]: https://developers.google.com/web/updates/2020/10/http-cache-partitioning

Additional metrics: https://github.com/shivanigithub/http-cache-partitioning#impact-on-metrics
newscracker, over 4 years ago

> I have mixed feelings about these changes.

I feel for those on low-bandwidth and low-data-limit connections. Website developers should focus on bloat and address that. That doesn't seem to be happening on a larger scale, though.

> It's excellent that browsers consider privacy, but it looks like anything can be misused for tracking purposes these days.

Of course. Every bit of information you provide to a site will be misused to track and profile you. That's what the advertising-fueled web has gotten us to (I don't blame it alone for the problems).

I wasn't aware that Safari had handled the cache privacy issue in 2013. It seems like it has always been way ahead on user privacy (though it's not perfect by any means). I've been a long-time Firefox user who has always cleared the caches regularly, and I'm curious to know if any browser has consistently provided more privacy out of the box than Safari.
achairapart, over 4 years ago

I wonder how long it will take for browsers to go beyond the cache concept and implement an integrated package repository, so I can upload my manifest + my 3kb app.js and tell the browser to download (and store) all the dependencies I need.

It will not only help with performance, but will also stop the absurd tooling madness that front-end has become.
llarsson, over 4 years ago

So the natural progression here is that only big sites with their own CDN solution will be fast? And for most people and companies, that will mean "rely on a large service to actually host your content for you", because they are not operating their own CDN. Because speed matters when it comes to search ranking.

So they are then beholden to major platforms such as Google to host sites for them from a global cache? Similar to what AMP does, but for all kinds of content?

Hmm.
k_sze, over 4 years ago

Assuming all browsers are going to implement this partitioning, doesn't it give web devs even more reason to use 3rd-party CDNs? You're not paying for the traffic, and you don't have to worry about users' privacy.
donatj, over 4 years ago

What is the worst case of private information leaking here? I can't seem to come up with anything that horrible from my own imagining, especially not worth throwing away the advantage of cross-domain resource caching.

The example that they give, that you're logged into Facebook, doesn't seem very useful other than maybe for fingerprinting? But even then, 90-some percent are going to be logged in, so the only real fingerprinting there is on the people who aren't.
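The leak donatj asks about is usually demonstrated as a timing probe: an attacker page loads a resource that only appears on some other site's page (for example, a logged-in dashboard asset), and a fast load implies the victim has it cached and therefore visited that page. A toy model of the probe, with all URLs and timings invented for illustration:

```python
# Toy model of the cross-site cache timing probe against a SHARED cache.
# All names and timings here are illustrative, not a real attack script.

import time

shared_cache = {}

def timed_fetch(url):
    """Return how long the load took: slow on a miss, near-zero on a hit."""
    start = time.perf_counter()
    if url not in shared_cache:
        time.sleep(0.05)            # simulated network round trip
        shared_cache[url] = b"..."  # resource body (contents irrelevant)
    return time.perf_counter() - start

# Victim browses bank.example, whose logged-in page loads a unique asset:
timed_fetch("https://bank.example/assets/dashboard-logo.png")

# Later, an attacker page probes the same URL and measures the load time:
probe = timed_fetch("https://bank.example/assets/dashboard-logo.png")
visited = probe < 0.01  # fast => was cached => victim likely visited
```

With a partitioned cache, the attacker's probe lands in its own partition, always misses, and learns nothing; repeated over many such URLs, the unpartitioned version also doubles as a history sniffer, which is the stronger case for the change.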
ThePadawan, over 4 years ago

So what we actually need is:

- a decentralized way to store these libraries
- by a source with established trust (so it can't be misused for tracking)

A JS/CSS library blockchain?
arkitaip, over 4 years ago

One of the benefits of using CDN resources is that it makes prototyping with, say, Bootstrap so fast, because you can essentially upload a single HTML file instead of a bunch of CSS, JS, and graphics. I mean, that will still be possible, but there are more benefits to CDNs than just performance.
jacobr, over 4 years ago

Even more food for thought: what if the cache is slower than the network? https://simonhearne.com/2020/network-faster-than-cache/
dbrueck, over 4 years ago
The negative effect is probably overblown; keep in mind that subsequent visits by the user to the same site can still use the cached version they loaded previously, and the odds of a cache hit in that case are relatively high.
viraptor, over 4 years ago

I'm curious if this will result in any popular CDN folding. After a decent chunk of users update, they'll be hit with much more traffic than usual, and possibly more than they can afford in some cases.
kreetx, over 4 years ago

Perhaps using Content-Security-Policy headers for trusted CDNs could fix this?
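For context on kreetx's suggestion, a CSP header restricting scripts to a trusted CDN looks like the sketch below (the CDN hostname is just an example). Note, though, that CSP only governs where a page may load resources from; it does not control what other sites can observe about a shared cache, so on its own it would not close the timing side channel that partitioning addresses.

```http
Content-Security-Policy: script-src 'self' https://cdnjs.cloudflare.com
```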
cblconfederate, over 4 years ago

No mixed feelings, this is unconditionally good. Whoever thought it was OK to force your users to download stuff from unrelated, commercial servers?
Cloudef, over 4 years ago

IPFS and BitTorrent v2 solve this problem by addressing content with hashes rather than URLs.
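Content addressing as Cloudef describes makes the cache key a digest of the bytes, so identical libraries deduplicate no matter which server delivered them; the web's closest existing mechanism is Subresource Integrity, which only verifies such a hash rather than keying the cache on it. A minimal sketch (browsers do not actually cache this way):

```python
# Sketch of a content-addressed store: the key is a digest of the bytes,
# not the URL, so the same library served by two CDNs is stored once.
# Illustrative only; SRI verifies hashes but browsers don't key caches on them.

import base64
import hashlib

store = {}

def put(content: bytes) -> str:
    """Store content under its own digest and return that address."""
    address = hashlib.sha256(content).hexdigest()
    store[address] = content
    return address

def sri_attribute(content: bytes) -> str:
    """The equivalent Subresource Integrity value for a <script> tag."""
    digest = hashlib.sha384(content).digest()
    return "sha384-" + base64.b64encode(digest).decode()

lib = b"/*! some library v3.5.1 */ ..."
addr_a = put(lib)  # "fetched" from one CDN
addr_b = put(lib)  # same bytes from another CDN: same address
assert addr_a == addr_b and len(store) == 1
```

Worth noting that content addressing alone does not remove the privacy leak: a probe can still time whether a given hash is already present, so some partitioning or trusted preload set would still be needed.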
ofrzeta, over 4 years ago

At work we can no longer load stuff from CDNs anyway, because of the GDPR. For customer projects, that is. I guess there would be the possibility to include it in some disclaimer, but then we would need to check with the CDN about their data retention policy and check with the customer, and that's just not worth it.