
We rendered a million web pages to find out what makes the web slow

187 points by KukiAirani over 4 years ago

27 comments

chrismorgan over 4 years ago

I'm curious about the HTTP/0.9 stuff. Last time I checked (four or five years ago, by making a trivial HTTP/0.9 server, as the first step of a "writing HTTP servers" tutorial series that never materialised), Firefox either wouldn't render the output at all, or would render it as though it had been a text/plain response (can't remember which it was). In other words, it's not a real web page. I would expect precisely zero occurrences among the top million pages, not the thirty that 0.003% indicates. I think it's *far* more likely (p≅0.9999) that these indicate some sort of error.

(For those unfamiliar with it, HTTP/0.9 is the label for a protocol where the client opens a TCP connection to the server and sends "GET /", and the server responds with the body, verbatim, and closes the connection. No status codes, no headers, nothing.)
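To make that protocol description concrete, here is a minimal HTTP/0.9 exchange sketched in Python — a toy localhost server and client, where the body content, port choice, and buffer sizes are arbitrary illustrations, not anything from the article:

```python
import socket
import threading

BODY = b"<html><body>Hello from HTTP/0.9</body></html>"

def serve_once(srv: socket.socket) -> None:
    """Accept one connection, read the request line, send the bare body, close."""
    conn, _ = srv.accept()
    request = conn.recv(1024)        # e.g. b"GET /\r\n" -- no headers follow
    assert request.startswith(b"GET ")
    conn.sendall(BODY)               # no status line, no headers: just the body
    conn.close()                     # closing the connection marks end-of-body

srv = socket.socket()
srv.bind(("127.0.0.1", 0))          # let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]
threading.Thread(target=serve_once, args=(srv,), daemon=True).start()

cli = socket.create_connection(("127.0.0.1", port))
cli.sendall(b"GET /\r\n")           # the entire HTTP/0.9 request
response = b""
while chunk := cli.recv(4096):      # read until the server closes
    response += chunk
cli.close()
print(response == BODY)             # the response is the body, verbatim
```

Note there is nothing in the response a browser could use to pick a content type, which is consistent with Firefox treating such output as text/plain (or refusing to render it).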
ffpip over 4 years ago

Use an adblocker. I save about 25 GB and hours of my time every month.*

https://www.ublockorigin.com

\* I block much more than ads.
ru552 over 4 years ago

https://catchjs.com/Blog/PerformanceInTheWild — actual link to the people who did the study. Itnext.io just ripped it off with a small credit at the bottom.
kaycebasques over 4 years ago

As jameslk also mentioned, this article mistakenly uses "dominteractive" interchangeably with "time-to-interactive". They are not the same metric. Lighthouse uses a different algorithm for computing TTI.

https://web.dev/interactive/#what-tti-measures

https://developer.mozilla.org/en-US/docs/Web/API/PerformanceNavigationTiming/domInteractive
jefftk over 4 years ago

> *If we were to believe this, we would conclude that moving requested resources from HTTP 1.1 to 2 gives a 1.8x speed-up, while going from HTTP 2 to 3 causes a 0.6x slow-down. Is it really true that HTTP 3 is a slower protocol? No: a more likely explanation is that HTTP 3 is rare, and that the few resources that are being sent over HTTP 3 (e.g. Google Analytics) are things that have a larger than average effect on dominteractive.*

If you wanted to measure the effect of protocol, you could compare requesting with support for HTTP/N to requesting with support for HTTP/N+1. Since this is all synthetic testing, it shouldn't be too hard to run a controlled experiment instead of a correlational one.
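The correlation-versus-causation point here can be illustrated with a toy simulation (all sizes and speed-ups below are hypothetical numbers, not the article's data): even if HTTP/3 were strictly faster for any given resource, the correlational average can make it look slower when the few resources using it happen to be large:

```python
import random

random.seed(0)

def load_time(size_kb: float, protocol: str) -> float:
    """Toy model: HTTP/3 is genuinely 20% faster for the same resource."""
    base = size_kb * 1.0                     # 1 ms per KB, made up
    return base * (0.8 if protocol == "h3" else 1.0)

# Confounder: the rare HTTP/3 resources are mostly large analytics bundles.
h2_sizes = [random.uniform(5, 50) for _ in range(1000)]    # typical resources
h3_sizes = [random.uniform(200, 400) for _ in range(30)]   # rare, heavy ones

h2_mean = sum(load_time(s, "h2") for s in h2_sizes) / len(h2_sizes)
h3_mean = sum(load_time(s, "h3") for s in h3_sizes) / len(h3_sizes)
print(h3_mean > h2_mean)   # True: correlationally, HTTP/3 "looks" slower

# Controlled comparison: the same resources with the protocol toggled.
paired = [(load_time(s, "h2"), load_time(s, "h3")) for s in h2_sizes]
print(all(h3 < h2 for h2, h3 in paired))   # True: per resource, HTTP/3 wins
```

The paired comparison is the controlled experiment jefftk describes: hold the resource fixed and vary only the protocol, instead of averaging over whatever happens to use each protocol in the wild.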
Log1x over 4 years ago

Original article: https://catchjs.com/Blog/PerformanceInTheWild
difosfor over 4 years ago

If you try to derive which framework is used based on global variables, then you'll miss the ones that don't pollute the global namespace like that but instead use closures etc. — e.g. most modern framework+toolkit combinations like React+Webpack, as far as I know.
ArtWomb over 4 years ago

Not in the scope of this analysis, but my #1 regression is large PDFs in tabs. They're such a standard part of the web experience, especially for sharing academic research, that they feel like a legitimate pain point in dire need of an upgrade.
rasz over 4 years ago

I can guess without reading:

1. Ads

2. Client-side rendering. YouTube nowadays is 400–1000 kilobytes of JSON + ~8(!) megabytes of JavaScript [1]. Pre-Polymer (the YouTube client-side rendering engine update), you would receive 50 KB of pre-rendered pure HTML that would display instantly. Nowadays scrolling the comments results in seeing them appear one by one while the browser struggles with constant DOM updates.

[1] >7 MB desktop_polymer_inlined_html_polymer_flags.js + 1.6 MB base.js

Edit: after reading the article:

> Top linked URLs

That's amazing. I've got the same combination on my luggage: my tracking blocker blocks every single one of those (even the yt iframe one). Nothing about ads or tracking.

> What marketing strategies does Itnext use? Get traffic statistics, SEO keyword opportunities, audience insights, and competitive analytics for Itnext.

Oh, Itnext is all about that user tracking.
lsllc over 4 years ago

I love this quote measuring the age of jQuery in different units:

*JQuery was first released in 2006, which is 14 years ago in human years, but much longer in JavaScript years. Measured in Angular versions, it is probably hundreds of versions ago.*
zerop over 4 years ago

One more point I observed while debugging UI performance earlier: even though JS files are cached by browsers, loading them from disk (cache) into memory during page load is still slow. Of course, it also depends on the machine's capacity.
sambeau over 4 years ago

I had to laugh: it took 5 seconds for that page to load for me.
jameslk over 4 years ago

It seems web performance is one of the least understood topics, even among engineers. Unfortunately even this author makes the mistake of confusing DOM Interactive [0] with Time to Interactive [1]. And yet this is a better analysis than what I often see repeated backed with no evidence.

For example, the myth that page weight and big SPA JS websites are the source of web performance issues is one I see here frequently. Even in this thread others have started to claim the problem is all JS (take a closer look at the article). And it's good to see some actual data to back what I have actually seen optimizing many websites myself (spoiler: a lot of badly performing sites use jQuery).

For speed (which is also a complex subject that can't be captured in one metric [2]), the problem isn't page weight or JS frameworks, it's *network latency* [3]. Because you can't optimize the speed of light. This is especially true for websites that connect to many different origins (usually for different types of analytics services) and for many things related to the critical rendering path [4].

The issues I see most often are not huge frameworks loading, but inefficient loading of all resources, especially those that affect user-centric timing metrics. I frequently see many render-blocking CSS and JS files loaded, which increases First Contentful Paint. I see images, script code, and other resources that affect below-the-fold content loaded before above-the-fold content and resources. And of course my favorite: above-the-fold content loaded with JS. These affect Largest Contentful Paint. Etc. etc.

Of course we can all claim the problem is something else and collectively decide to switch to server-side rendering as an industry, but this won't fix issues with web performance. Understanding how browsers load pages and how your users perceive the load will.

0. https://developer.mozilla.org/en-US/docs/Web/API/PerformanceTiming/domInteractive

1. https://web.dev/interactive/

2. https://developers.google.com/web/fundamentals/performance/speed-tools#common_myths_about_performance

3. https://www.speedshop.co/2015/11/05/page-weight-doesnt-matter.html

4. https://developers.google.com/web/fundamentals/performance/critical-rendering-path
ableal over 4 years ago

> tracking every conceivable performance metric

Technically it's not a metric for one-off rendering, but memory leaks by an open browser tab have bugged me a bit lately.

Safari does a per-URL report in macOS, and the ancient MacBook with "only" 8 GB RAM gets a tad hot and bothered when a local PC store page runs up a 1.5 GB tab just sitting there, GMail snarfs 1 GB if not opened in a fresh tab once in a while, etc.
these_are_not_t over 4 years ago

Tangential question: are there giveaways other than download time for a cached document which could be used by malicious scripts?

I ask because I don't understand why a zero download time for a cached document couldn't simply be masked by some (random) wait by the browser instead of downloading the file again.

From the Chrome update page linked in the article, the explanation is:

> However, the time a website takes to respond to HTTP requests can reveal that the browser has accessed the same resource in the past, which opens the browser to security and privacy attacks, [...]

which seems to indicate that only time matters in the attacks. Yet the third bullet point suggests:

> Cross-site tracking: The cache can be used to store cookie-like identifiers as a cross-site tracking mechanism.

as a possible attack based on the cache, which doesn't seem to involve document download time.
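The timing half of the question can be sketched abstractly in Python. Everything below is a made-up illustration (the URLs, millisecond values, and threshold are invented, not real measurements): a script times a fetch and infers cache state from it, which is the channel that masking with a random delay would address — whereas using the cache itself as storage, per the second quote, involves no timing at all:

```python
# Toy model of the cache-timing side channel: a cached resource returns
# almost instantly, an uncached one pays the network round trip.
CACHE_HIT_MS = 2
NETWORK_MS = 80

def fetch_ms(url: str, cache: set) -> float:
    """Return a simulated load time; first fetch populates the cache."""
    if url in cache:
        return CACHE_HIT_MS
    cache.add(url)
    return NETWORK_MS

def probably_visited(url: str, cache: set, threshold_ms: float = 20) -> bool:
    """What a malicious script infers from load time alone."""
    return fetch_ms(url, cache) < threshold_ms

cache = {"https://victim.example/logo.png"}   # user visited this site before
print(probably_visited("https://victim.example/logo.png", cache))  # True
print(probably_visited("https://other.example/logo.png", cache))   # False
```

This is why cache partitioning (per-site caches) was the fix Chrome chose rather than timing jitter: it closes both the timing channel and the cache-as-identifier-storage channel at once.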
pjc50 over 4 years ago

I knew about googleanalytics; some years ago, back when browsers still had a status bar at the bottom, I noticed many sites waiting on it. So I redirected that domain to 127.0.0.1 and went on my way noticeably faster.

I'd not encountered Amazon Publisher Services, but this article makes them look very bad.
superkuh over 4 years ago

It's pretty obvious: JavaScript. Web applications that display text and media are *slow*. Web documents that display text and media are fast.
inthewoods over 4 years ago
Seems like jQuery is a big cause of slowdowns - anybody have suggestions for how to provide similar functionality without it?
jeffbee over 4 years ago

The HTTP protocol-level numbers have got to be heavily confounded by being served from a high-performing AS. Won't any technique appear to be correlated with faster load times if it happens to be served from, e.g., the same EC2 region where they ran these browsers?
k__ over 4 years ago

Pages with rocket-lazy-load are generally faster.

AFAIK Rocket is a Rust framework, and my guess is that the average Rust dev cares more about perf. Which would imply that perf could be more about mindset than technology.

But that's just my humble interpretation...
kaycebasques over 4 years ago

See also the Web Almanac for an in-depth look at the state of the web: https://almanac.httparchive.org/en/2020/
woeirua over 4 years ago
If 50% of the top million websites use JQuery, is it time to just admit that it should be bundled with JS by default in some way, shape or form?
Uninen over 4 years ago
Sad to see Sentry on the bottom 5 JS heap size list. Love their service but the JS payload is pretty massive.
marosgrego over 4 years ago

Just don't use JavaScript lol
thefz over 4 years ago

Don't tell me: it's mostly ad media, and then a bunch of .js packages.
geggam over 4 years ago

Didn't read the article; my money is on JavaScript crap linked into the page from 200 different sites.
jokoon over 4 years ago

Burn the DOM with fire. I'm so allergic to it. I'd rather write my own 2D tile tree format, with a simple flex-like rendering layout, than work with HTML and expect reliability. HTML was never designed for interactive applications.

The historic reality is that HTML served as a "backdoor"/gateway that let the Linux community compete with Microsoft: Linux devs could ship their code to Windows clients.

Now, HTML webapps cannot run properly on a mobile platform without depleting expensive batteries. So each mobile platform is making bucks on the backs of developers who have to work twice as hard.

I have very high hopes that WebAssembly could somehow solve all those problems, but there are several issues:

* Not all browser vendors like WASM, and I doubt that any WASM progress would be supported equally.

* WASM toolchains are not really mature for most languages. Compiler makers don't have a lot of incentive to work with WASM.

* The DOM would still exist.