The team at gov.uk is doing an excellent job regarding web performance, credit where credit is due. In many ways role model behavior.<p>Still, the results are such a stretch as to not have much meaning. They have to descend all the way to 2G to see any meaningful difference, and I'm assuming these are cold visits (typical in lab-based testing).<p>For those exceptional users, this creates a difference from very poor (12 seconds) to still poor (9 seconds). Probably less, because they'd normally have a warmed-up cache.<p>Is it empathetic to improve performance for those users? Very much yes, do it as far as your budget allows. But as far as jQuery specifically is concerned, the conclusion is that its negative impact is negligible.
Has anyone noticed how all UK government websites like <a href="https://www.gov.uk/" rel="nofollow">https://www.gov.uk/</a>, <a href="https://www.nhs.uk/" rel="nofollow">https://www.nhs.uk/</a>, <a href="https://tfl.gov.uk/" rel="nofollow">https://tfl.gov.uk/</a>, <a href="https://coronavirus.data.gov.uk/" rel="nofollow">https://coronavirus.data.gov.uk/</a> have the same look-and-feel and the same UX? Are there more such examples of countries that have uniform UX for their government websites?
That's nice but when you load the actual universal credit login page it needs nearly 400kb of reactjs - not sure about after that, don't have an account..<p>Some (most?) of the other gov uk sites have megabytes of js on them.. eg. <a href="https://coronavirus.data.gov.uk/details/testing?areaType=nation&areaName=England" rel="nofollow">https://coronavirus.data.gov.uk/details/testing?areaType=nat...</a>
Honestly, if they'd just used jQuery instead of React all the way through, they'd be loading a lot less JS and performance would be vastly better on those slow old devices. After all, jQuery was made for the slow old devices, back when they were neither slow nor old.<p>The real problem is that jQuery is not new and shiny, and nobody wants to be doing jQuery in 2022.
> <i>Now I know what you may be thinking, that doesn’t sound like a lot of data, especially compared to images which can be multiple megabytes in size. But when it’s on every page, from a web performance perspective, it equates to a lot of data.</i><p>Er... the utility of jQuery in 2022 aside, I would say: if it equates to a lot of data when it's on every page (of a single website), you're doing something wrong?! I mean, caching content between different domains is not a thing anymore, but on the same domain jQuery should only be loaded once?
What I miss in this otherwise excellent article is more insight into the methodology. There is just this murky passage:<p>> We run tests every day on GOV.UK pages using specific simulated devices and connection speeds. Because we can repeat these tests every day, it allows us to monitor what the changes we are making to the site are doing for real users visiting GOV.UK.<p>with no mention of whether those were all cold-start loads or simulated browsing scenarios.<p>Not wanting to diminish the effort or challenge the conclusions (I know that jQuery does quite a lot of checks and feature detection when it initializes, even when it is not used, so removing it <i>must</i> have a significant impact), but if those measurements were done as cold-start, empty-cache single loads each time, they might count some "extra" time (download, parse, tokenize, build the AST) that usually precedes only the initial load and "first run" of a static library and is not present in subsequent interactions.
When I did web development, I specifically loaded JS lazily: first you got a pure HTML page with just a few lines of code. Those lines are plain (non-jQuery) JS that load the rest of the JS lazily as a progressive enhancement. This means the page can start rendering before the jQuery code loads.<p>But in fact, jQuery is largely not needed now that Internet Explorer has lost market share and there's less need for lots of workarounds for it.
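A minimal sketch of that pattern (the function name and structure are hypothetical, not anyone's actual code). The `doc` parameter is injected rather than using the global `document` directly, purely so the helper can be exercised outside a browser:

```javascript
// Progressive-enhancement loader: the page ships as plain HTML, and this
// tiny inline snippet pulls in the heavy scripts after the HTML is parsed.
function lazyLoadScripts(doc, urls) {
  const inject = () => {
    for (const url of urls) {
      const s = doc.createElement('script');
      s.src = url;
      s.async = true;            // don't block parsing if this runs early
      doc.body.appendChild(s);
    }
  };
  // Wait for the HTML to be parsed before adding the enhancement scripts.
  if (doc.readyState === 'loading') {
    doc.addEventListener('DOMContentLoaded', inject);
  } else {
    inject();
  }
}
```

In a real page you'd inline this in a `<script>` at the end of `<body>` and call it with `document`, so the initial render never waits on jQuery or anything else.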
Honestly I feel like jQuery is my "secret weapon" similar to how Lisp is Paul Graham's in Beating the Averages. It's so much more productive than vanilla JS with negligible impact on performance, especially with caching.
> especially compared to images which can be multiple megabytes in size<p>Please don't put multi-megabyte images on web pages. My boss put a 10-megabyte image on the homepage of one of our clients; he fancied himself a photographer, and he took the photo. It gave a 10-second page-load time from a desktop PC. I offered to shrink it for him, which would have taken 2 minutes; he wasn't having it.<p>I don't think it was pride in his photo; I think the client wasn't paying their bills, and he wanted to punish them.<p>But please, shrink your images to fit the screen of your most capable target device. There's no need for super-hi-res posters on websites.
> But when it’s on every page, from a web performance perspective, it equates to a lot of data.<p>How does browser caching come into play here? Doesn't it make a difference?
Maybe. jQuery reduces the number of characters you need to write compared to vanilla JS, so if their websites use some JS, it was probably a benefit. Now tell me about the megabytes of JavaScript that their new fancy frameworks download, <i>uncached</i>.
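To make the brevity comparison concrete, here are a few common jQuery idioms next to their vanilla equivalents (the jQuery lines are in comments so the snippet runs without the library; the values are made up for illustration):

```javascript
// jQuery: $.extend({}, defaults, options)
const defaults = { retries: 3, timeout: 1000 };
const options  = { timeout: 5000 };
const merged   = Object.assign({}, defaults, options);

// jQuery: $.map([1, 2, 3], n => n * 2)
const doubled = [1, 2, 3].map(n => n * 2);

// jQuery: $('.item').addClass('active')
// Vanilla needs an explicit loop over the NodeList:
// document.querySelectorAll('.item').forEach(el => el.classList.add('active'));
```

For the utility functions the gap has mostly closed since ES2015; it's the DOM-heavy selectors and effects where jQuery still saves the most typing.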
"""
Since these users have the slowest devices, they’ll need the most help to make sure their visit to GOV.UK is as fast, efficient and as cheap as possible
"""<p>If only web developers cared about *all* users the web would be much less crappy.
> But when it’s on every page, from a web performance perspective, it equates to a lot of data.<p>Author of this article is apparently unaware of browser caches.<p>> JavaScript is known as a “render-blocking” resource.<p>Yeah, if only there was, like, an async attribute or something.<p>> The graph below shows time to interactivity dropped from 11.34 seconds to 9.43 seconds<p>So, jQuery is way too heavy for these people, but interaction tracking analytics (these packages usually <i>start</i> around 200kB) is perfectly fine?<p>> total page load time dropped from 19.44 seconds to 17.75 seconds<p>Burying the lede here, if my team celebrated a 17 second page load, we'd be fired on the spot. Going out on a limb here to suggest jQuery is the least of their problems.
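For reference, the attribute in question looks like this (illustrative markup, not GOV.UK's actual template):

```html
<!-- async: downloads in parallel and executes as soon as it arrives,
     without blocking HTML parsing (execution order not guaranteed) -->
<script src="/assets/app.js" async></script>

<!-- defer: downloads in parallel but executes only after the document
     has been parsed, preserving script order -->
<script src="/assets/app.js" defer></script>
```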
I’m still a huge fan of jQuery. When I start a new web project then the first thing I do is add jQuery to the project. This is pretty much the only dependency that I ever use in my web projects. Nothing can beat the short “$” function and effects such as fadeIn, fadeOut, and others.
"But when it’s on every page, from a web performance perspective, it equates to a lot of data."<p>Caching? Set cache headers such that the jquery.js rarely/almost never expires and requires a re-fetch?
Couple of relevant previous discussions around this subject:<p>Gov.uk drops jQuery from their front end (90 days ago|437 comments)
<a href="https://news.ycombinator.com/item?id=31434434" rel="nofollow">https://news.ycombinator.com/item?id=31434434</a><p>How and why we removed jQuery from Gov.uk (5 days ago|43 comments)
<a href="https://news.ycombinator.com/item?id=32423422" rel="nofollow">https://news.ycombinator.com/item?id=32423422</a>
It’s so nice to know that someone still cares about web performance, in a world where it’s quite normal to see websites with static information and 1-5MB of compressed JS, which take a long time to load and make your device hot.
At the end he mentions a certain cohort of users, but how many users were actually in that cohort? jQuery is an older technology. I find it hard to believe something made to work on older hardware and slower internet is affecting performance that much. We have 5G and stupid-fast chips in phones now. I’m skeptical. Performance issues are usually database queries or code compiling / “waking up” some server function.