Previous discussion:
<a href="https://news.ycombinator.com/item?id=10820445" rel="nofollow">https://news.ycombinator.com/item?id=10820445</a>
129 days ago with 1122 upvotes
While a nice article that I do agree with, it's not exactly accurate.<p>90% of website size is media (images, audio, video, etc). Discounting that and saying that a 1MB page is so much bigger than just the text it delivers is silly and doesn't really make a point. Maybe it's unnecessary, maybe not, but that's subjective and most people would rather have images and a decent UI.<p>It's true that ads are another factor, and it's slowly being solved, but it ultimately comes down to the wrong incentives across the industry. The more annoying/heavy ads work better and pay more, so everyone from advertisers to agencies to publishers will continue to optimize toward them until something changes (which might be adblocking). This isn't a technical problem so much as business and politics.<p>However, the publishers these days <i>are</i> pretty short on tech talent and mostly use off-the-shelf CMSes like WordPress (which are already bad software) and then just layer on their plugins, themes, and ads to create the mess we have today. Again, no easy way to solve that without talent either on the pub side or in the platforms themselves. Some progress is being made here with things like Facebook Instant Articles and medium.com hosting.<p>Overall, the web definitely has a cruft problem, but it's not really that bad considering all the various channels of information access, and it's all slowly getting better.
I usually have a fast connection and reasonable data limits available, so I don't worry too much about how much stuff my browser needs to pull down to render a website. However, there's another type of bloat that absolutely kills my browser and that is all of the JavaScript.<p>One of my computers is a netbook-ish Lenovo x131e from a few years ago. It runs everything I need it to very well, except for web browsing in Firefox. Many pages cause the browser to completely choke for a few seconds while the JavaScript does its thing.<p>I finally broke down and installed NoScript and it feels like I'm browsing the web on a brand new computer! Pages load fast and render quickly. If JavaScript is required, I can enable it on a site-by-site basis. There's also the privacy and security benefits, but my main issue was performance.<p>I used to think that people who browse with JavaScript disabled were being silly, but now I understand why someone would want to do that.<p>I know that part of the problem is Firefox and part of it is the under-powered computer. Browsing the same sites in Safari on a 6-year-old MacBook Pro isn't nearly as painful and, generally speaking, I leave the JavaScript alone on that computer.
Great stuff, I watched the video presentation too. The audience clapped at the part where he shows the pyramid illustration of the web - HTML followed by a huge chunk of crap, then surveillance at the top. Obviously he is not alone in recognizing the problem.<p>I'm glad to hear Google are planning to label slow sites. We need this. Bloated websites need to be held accountable.<p>On mobile devices too, we need a browser option to stop loading any further data for a given site after a defined point. So if the browser has received 3 megabytes of a page so far, it stops and asks the user whether to continue with downloading. It might say how many cents this one website has cost them so far (if the user has set up this feature).<p>Fair enough, "modern web, modern features" - but most of the modern features we enjoy are improvements to browsers, servers and javascript. There's no reason why this can't mean keeping page size steady while enjoying new features.
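The byte-budget idea above doesn't exist as a browser feature, but it can be approximated in page script today with a streamed fetch. Here is a minimal sketch: `readWithBudget` and the budget value are illustrative, not part of any standard API.

```javascript
// Read a response body chunk by chunk, stopping once a byte budget
// is exhausted -- a userland sketch of the "ask after 3 MB" idea.
async function readWithBudget(stream, maxBytes) {
  const reader = stream.getReader();
  const chunks = [];
  let received = 0;
  while (received < maxBytes) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
    received += value.byteLength;
  }
  await reader.cancel();       // stop the download here...
  return { chunks, received }; // ...and let the caller prompt the user
}
```

In a real page this would wrap something like `fetch(url).then(r => readWithBudget(r.body, 3 * 1024 * 1024))`, pausing until the user chooses to continue; a proper implementation would belong in the browser itself, as the comment suggests.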
Calling this a "Crisis" is definitely blowing this out of proportion. True, websites have grown by a lot over the past decade, and so has functionality. Take a look at your favourite websites on the Wayback Machine and see how god-awful most of them looked just 6-7 years ago, and compare the functionality they provided vs what we have now.<p>Yes, I agree some websites really do need to go on a diet - unused CSS, JS and multimedia have to be removed - but 1-2MB is not a reason anyone should be throwing their hands in the air screaming CRISIS!! I would definitely applaud anyone who would spend their time optimising their website to be as small as it can be, but Jesus Christ, 2MB and we have a crisis... no way!<p>I for one think this is a sign we are making progress, which is a very good thing: we are improving as a community and we are getting more out of our web.
I love this article/presentation or whatever it is.<p>I often react very positively when websites are super-fast today. Especially on a mobile device, it is worth a lot to me.
I love this from the article:<p>These Apple sites exemplify what I call Chickenshit Minimalism. It's the prevailing design aesthetic of today's web.<p>"Chickenshit Minimalism: the illusion of simplicity backed by megabytes of cruft."
Just took over a website for a paying client, local small business. Found out their previous developer was some rockstar guy, thing was packed with frameworks and nuttiness.<p>Built a scraper and converted it to something using Markdown in a weekend, I barely used any of previous dev's "code."<p>Sometimes all you need to dig a hole is a shovel.
We need a decent @media equivalent in HTML.<p>We need a bandwidth @media selector (that is kept up to date with recent conditions by the browser).<p>We need optional (viewport conditioned) lazy loading built into the spec.<p>It's time for these features to become part of the web stack, but maybe we need a codified mediaQuery.js first.
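No such bandwidth selector exists in CSS today. The closest script-side stand-in is the experimental Network Information API (`navigator.connection`, Chromium-only), whose `downlink` and `saveData` fields can drive asset choices. A minimal sketch, with illustrative variant names and thresholds:

```javascript
// Pick an asset variant from connection info -- a script-side stand-in
// for the missing bandwidth @media selector.
function pickVariant(connection) {
  if (!connection) return "standard";    // API unavailable: play it safe
  if (connection.saveData) return "low"; // user asked for less data
  const mbps = connection.downlink || 0; // estimated downlink in Mb/s
  if (mbps >= 5) return "high";
  if (mbps >= 1) return "standard";
  return "low";
}
// In a browser: const variant = pickVariant(navigator.connection);
```

This is exactly the kind of logic a "mediaQuery.js" would centralise before the behaviour could be codified into the web stack proper.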
Fun fact: Despite the hyper-cautious (2015) in the headline here, the AMP page described, the one that Google said they were going to fix, still redownloads the same video file every 30 seconds, thus making it, still, theoretically unbounded in page size.<p>The NPR page about ad-blockers which was 12 megabytes without an adblocker and 1 megabyte when using an adblocker is now 1.4MB with an adblocker and "only" 1.9MB without -- to display about 1,100 words.<p>The medium article that was over a megabyte seems to have removed the pointless 0.9MB invisible image.
I think it may be worth specifically excluding images from some of these checks, or adjusting for what can be compressed in the images themselves. Though it's enough to say that they can be a huge portion of a site.<p>That said, I was able to boilerplate Preact + Redux for creation of a control that will be used stand-alone, and the payload is about 16k of JS (min+gz) and under 1k of CSS [1]. The methodology I used could very well be carried to more "full" applications. There's very little reason most modern web applications can't be way under 500kb payload (excluding images). In this case I wanted more modern tooling and workflow, but a fairly light datepicker modal... I feel most datepickers suck. Could it be lighter? Yes [2], but I wanted a little bit of a different approach. In the end it works...<p>All of that said, the biggest points of code bloat are usually in bringing in an entire library instead of only the pieces needed, especially bad with UI controls... I really wish more people would use/extend Bootstrap from source here. It's really easy to do... usually I copy the variables file, copy the Bootstrap base file, create a mixins file, and then update all the references in the copied base. From there, I can comment out/in as needed, and be fairly light.<p>Of course, fonts are another source of bloat. I'd suggest people start leaning towards using SVG resources embedded in JS strings, and only those icons needed... all modern browsers support SVG well enough in this case. Other web fonts should be limited to 2-3 includes of 1-2 families... that's it. Any more and your design is flawed anyway.<p>With webpack + babel, it isn't so hard to keep your applications structured, and much more minimal.<p>[1] <a href="https://github.com/tracker1/md-datepicker" rel="nofollow">https://github.com/tracker1/md-datepicker</a>
[2] <a href="https://dbushell.github.io/Pikaday/" rel="nofollow">https://dbushell.github.io/Pikaday/</a>
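The "SVG icons as JS strings" suggestion above can be sketched as follows; the icon names and path data here are illustrative, not from any real icon set. Each icon costs a few hundred bytes instead of pulling in a multi-kilobyte icon font:

```javascript
// Ship only the icons actually used, as inline SVG strings.
// stroke="currentColor" lets the icons inherit the surrounding text colour.
const icons = {
  close: '<svg viewBox="0 0 16 16" width="16" height="16">' +
         '<path d="M2 2 L14 14 M14 2 L2 14" stroke="currentColor"/></svg>',
  check: '<svg viewBox="0 0 16 16" width="16" height="16">' +
         '<path d="M2 9 L6 13 L14 3" stroke="currentColor" fill="none"/></svg>'
};

// Look up an icon by name; unknown names yield an empty string.
function icon(name) {
  return icons[name] || '';
}
// Usage in a page: element.innerHTML = icon('close');
```

With a bundler like webpack, unreferenced entries can also be tree-shaken away, which is the same "only the pieces needed" principle applied to icons.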
Perhaps we need smarter tools and better cooperation. For instance, I bet a large part of code in large websites is shared with some other website. Why can't our tools figure out if this is the case, and then create some kind of "shared library" to be used by both websites.
Just take the time making <i>your</i> website slimmer. Do not worry about the others, and reap the benefit of having more visitors than they do. It is selfish, but this is the only way companies are going to move: if the competition is getting more visitors because of slimmer websites. If in practice this changes nothing, then it was a false problem.<p>For my website (a chemical databank) I was able to measure the benefits of reducing the page size, with more visitors from countries with poor Internet connectivity.<p>So, just do it and enjoy the competitive advantage as long as you have it! This is the best way to get things moving.
Actual figures on the 'crisis'<p><a href="http://www.httparchive.org/interesting.php?a=All&l=Apr%201%202016" rel="nofollow">http://www.httparchive.org/interesting.php?a=All&l=Apr%201%2...</a><p>Average of 2.3MB with 60% of that being images. 50% of that content can be cached for at least a day.
For the last 20 years people have been saying "web pages today are too big." And for the last 20 years they've continued to get bigger.<p>The right size for a web page depends less on the opinion of random developers, and more on the website's owners and users.
Yeah, I did this<p><a href="http://andreapaiola.name/magpie/" rel="nofollow">http://andreapaiola.name/magpie/</a>
I was reading an astronaut's biography a few days ago, talking about the transition from the moon missions (where every half-hour was planned and accounted for) to Skylab (where people were working in the same place for long enough that it became necessary for them to have free time). And on one level it's a huge waste to have someone on the ISS where it's costing $x000/minute to keep them up there playing candy crush or whatever. But it's also a sign of maturity, that we are no longer desperately squeezing out every minute we can possibly get up there.<p>I think we're somewhere similar with the web. Internet is plentiful enough that we don't need to scrape and squeeze every last kilobyte. Maybe medium takes 3mb to render a simple plain page. That's ok.<p>(I do wish a lot of sites would up their information density though. Above all I wish I could get a full-width page, not a phone-width segment down the middle of my widescreen display. At home I've started using a stylesheet that disables max-width on all elements, and I've yet to see a site that looks worse that way)
As much as I want to agree with the article I really can't. All of the things that people claim as "bloat" are basically necessary for a website to work in a modern fashion...<p>Layout frameworks (eg Bootstrap) tackle the problem of developing a site that's readable at resolutions from 320px wide on a two year old phone up to a 5K iMac. That is <i>not</i> a trivial task, especially if you need UI components.<p>Application frameworks (eg Angular) make it <i>much</i> faster, and consequently a nicer experience, to maintain user state, load content, navigate around template driven pages, etc.<p>Media content has grown with resolutions and pixel densities. 10 years ago we were looking at websites on 1280x1024 displays, with no rich media. Today a consumer facing website has retina quality video. That's going to impact the page weight.<p>Being wasteful is a minor problem; everyone has fast broadband. Everything is cached at various layers from the browser to the service worker to the CDN to the origin server. Browsers are <i>really</i> fast. With some clever "cutting-edge-even-though-its-been-around-for-years" stuff like http2 you can fetch things in parallel.<p><i>Obviously</i> websites should be optimised. No one should be downloading media or libraries that aren't used. Animations should be hardware accelerated. Sites shouldn't be running in debug mode (<i>ahem</i>AngularJS<i>ahem</i>). But all in all, what we get in a browser these days is <i>far</i> better, <i>far</i> faster, and <i>far</i> more functional than websites were a decade ago. We could go back to the "works without JS, stateless on the clientside, roundtrip to the server and a whole new page for every click" world I learnt web development in, but I really don't want to. It was rubbish.