TechEcho: a tech news platform built with Next.js, providing global tech news and discussions.
The average size of Web pages is now the average size of a Doom install

857 points, by Ovid, about 9 years ago | 70 comments

jessriedel, about 9 years ago

I'm skeptical that developers talking to each other about how bad web bloat is will change anything. They will still face the same incentives in terms of ad revenue, costs of optimization, etc.

Here's a random idea that might have more potential: create an adblocker browser plugin that also colors URLs based on how slow they are expected to load, e.g., smoothly from blue to red. The scores could be centrally calculated for the top N URLs on the web (or perhaps, an estimate based on the top M domain names and other signals) and downloaded to the client (so no privacy issues). People will very quickly learn to associate red URLs with the feeling "ugh, this page is taking *forever*". So long as the metric was reasonably robust to gaming, websites would face a greater pressure to cut the bloat. And yet, it's still ultimately feedback determined by a user's revealed preferences, based on what they think is worth waiting how long for, rather than a developer's guess about what's reasonable.
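The coloring idea above can be sketched in a few lines. This is a hypothetical content-script fragment; the score table, the 5-second ceiling, and the hostname-keyed lookup are all assumptions, not part of any real plugin:

```javascript
// speedColor interpolates from blue (fast) to red (slow) given an expected
// load time in milliseconds, clamped at maxMs.
function speedColor(ms, maxMs = 5000) {
  const t = Math.min(Math.max(ms / maxMs, 0), 1); // 0 = fast, 1 = slow
  const red = Math.round(255 * t);
  const blue = Math.round(255 * (1 - t));
  return `rgb(${red}, 0, ${blue})`;
}

// A plugin would look up each link in a pre-downloaded score table and tint it:
// for (const a of document.querySelectorAll('a[href]')) {
//   const ms = scores[new URL(a.href).hostname]; // scores: hostname -> est. ms
//   if (ms !== undefined) a.style.color = speedColor(ms);
// }
```

The hard part, as the comment notes, is keeping the score pipeline robust to gaming, not the tinting itself.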
robotnoises, about 9 years ago

Before everyone jumps onto the jQuery/Bootstrap/etc.-sucks bandwagon, just a reminder that the minified jQuery from cdnjs is 84.1 KB. Bootstrap is 43.1 KB.

If you want your page to load fast, the overall "size" of the page shouldn't be at the top of your list of concerns. Try reducing the number of requests first. Combine and minify your JavaScript, use image sprites, etc.
K0nserv, about 9 years ago

Quite happy with my own web page/blog. Pages hover at around 10 KB, 30 KB if I include some images. I think the small page size can be attributed largely to there being no JS except for GA.

I have taken a lot of inspiration from http://motherfuckingwebsite.com/ and http://bettermotherfuckingwebsite.com/

Of course the size will differ depending on the site's purpose, but I feel like most web pages could stand to lose a lot of weight.

EDIT: I have a guide to setting up a similar blog/site here[0]

0: https://hugotunius.se/2016/01/10/the-one-cent-blog.html
Tenhundfeld, about 9 years ago

Interesting comparison, if a bit arbitrary. It raises a couple of questions, though.

1) How do the numbers come out when you exclude images?

It's valid and good to know the total sizes, including images, but that can hide huge discrepancies in the experienced performance of a site.

For example, a page with 150 KB of HTML/CSS/JS and a single 2.1 MB hero image can feel very different from a page with 2 MB of HTML/CSS/JS and a few 50 KB images.

If we're just interested in total bandwidth consumption, then sure, total size is a good metric. If we're interested in how a user experiences the web, there's a lot of variability and nuance buried in that.

2) What device and methodology were used to take the measurements?

In this age of responsive design, CSS media queries, and infinite scrolling/deferred loading, it really matters how you measure and what you use to measure.

For example, if I load a page on my large retina screen and scroll to the bottom, many sites will send far more data than if I load them on my phone and don't scroll past the fold.

I only skimmed the article and didn't dig into the references. These questions may be answered elsewhere.
seanwilson, about 9 years ago

Lots of people are focusing on excessive JavaScript and CSS, but these combined are easily dwarfed by a single high-quality image.

Try visiting Apple's website, for example. I can't see how you can have a small page weight if your page includes several images that are meant to look good on high-quality screens. You're not going to convince marketing and page designers to go with imageless pages.

Doom's original resolution was 320x200 = 64K pixels in 8-bit colour mode. Even an Apple Watch has 92K pixels and 24-bit colour (three times more space per pixel) now, and a 15" MacBook display shows 5.2M pixels. The space used for high-quality images on newer displays is orders of magnitude higher than what Doom-era hardware had to fill.
Kurtz79, about 9 years ago

It is hardly surprising, considering that a single picture taken with an average smartphone is probably already surpassing that by quite a bit.

Times change, and 20 years in tech is equivalent to several geological ages.

If anything, it's hard to overstate how some developers were able to craft such compelling gaming experiences with the limited resources available at the time.

My personal favorite as "most impressive game for its size":

https://en.wikipedia.org/wiki/Frontier:_Elite_II
Jerry2, about 9 years ago

Maciej Cegłowski has a great talk/writeup on this very problem: *The Website Obesity Crisis*

http://idlewords.com/talks/website_obesity.htm

Here's the video of the talk if you prefer to hear him speak: https://vimeo.com/147806338
seagreen, about 9 years ago

Oh God.

Every discussion about the web will continue to be a mess until we clarify what we're talking about.

Let's try rephrasing the title a couple of times.

Rephrase 1: "The average size of a webapp is now the average size of a Doom install."

Response: Interesting, but not bad! Heck, some webapps are games. "The average size of a web game is now the average size of Doom" isn't a sentence that damns the web, it's a sentence that compliments the web! (Or would, if it were true, and it might be for all I know.)

Rephrase 2: "The average size of a web document is now the average size of a Doom install."

Response: Well, this sucks (or would if it were true; we still don't know). Simple documents should be a few KB, not the size of a game.

Basically our terminology is shot to crap. Imagine if 19th-century engineers used the same word for "hand crank" and "steam engine". "Hand crank prices are skyrocketing! What's causing this massive bloat?" Welp, that could mean anything.

The best solution: web browsers should enforce a clear distinction between "web documents" and "web apps". These are two different things and should be treated separately. This won't happen though, which leaves us (the rest of the tech community) to explore other options...
dreamlayers, about 9 years ago

In the late 00s I remember turning on an old computer with a 650 MHz Athlon CPU and being surprised that web browsing performance in Firefox wasn't bad. Now if I try that with a 1 GHz Pentium 3, performance is absolutely horrible. Is this why?
spriggan3, about 9 years ago

The average data plan here is 10 GB:

10,000,000 KB / 2,250 KB per page ≈ 4,444 web pages a month

4,444 / 31 ≈ 143 web pages a day, at most, on mobile.

While that is somewhat acceptable, I don't see data plans getting cheaper, yet the size of the average web page is rising fast.

It doesn't seem like most websites have heavily invested in HTML5 offline capabilities or actual mobile-first design either, something easy to check with Chrome dev tools.

Also, let's talk about ads. Polygon.com, a site I visit often, first article on the homepage with an iPhone 5:

- with ads/trackers: 1.5 MB
- without ads: 623 KB

More than half of the load is ad/tracking related. This isn't normal.
overcast, about 9 years ago

With the majority of users moving towards mobile, I really think this is an issue, and I've been consciously building projects as lean as possible. Removing bloated jQuery libraries was a big one. With native calls like document.querySelector and document.querySelectorAll, I've found I can get by without jQuery 90% of the time. For the rest, something like vue.js takes care of all the DOM manipulation, data binding, etc.
warriorkitty, about 9 years ago

Oh, you just want to add a class to the element? *adds whole jQuery* That's what's wrong with the web.

Oh, and you need a loop? *adds underscore.js*
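Both cases have one-line native equivalents; a quick sketch (the selector and class names are illustrative):

```javascript
// Add a class without jQuery: classList is native DOM.
// document.querySelector('#menu').classList.add('active');

// Loop, map, and fold without underscore: array methods are built in.
const xs = [1, 2, 3];
const doubled = xs.map(x => x * 2);        // _.map(xs, f)
const sum = xs.reduce((a, b) => a + b, 0); // _.reduce(xs, f, 0)
```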
AndyKelley, about 9 years ago

I wanted to see how one of my personal projects compared, so I looked at Groove Basin.

Groove Basin [1] is an open source music player server. It has a sophisticated web-based client with the ability to retag, label, create playlists, stream, browse the library, chat with other users, upload new music, and import music by URL.

I just checked the payload size of a cold load, and it's 68 KB.

I'll just keep doing my thing over here.

[1]: https://github.com/andrewrk/groovebasin
datalist, about 9 years ago

Not too long ago Medium pushed an "invisible" 1 MB image to clients:

https://binarypassion.net/digital-decadence-6ea59251d64d

and the video it refers to: https://vimeo.com/147806338
stegosaurus, about 9 years ago

If web bloat is a problem, I don't think that looking at whether <insert buzzword framework of CURRENT_YEAR> can be removed is the answer.

I suggest that at the moment we have basically two camps of website, with rough, fuzzy boundaries.

1. A place where someone sticks up an insight, or posts a wiki page, or whatever, to share some thought with others (if anyone actually cares). The blogs of many users of HN. Hacker News itself. Wikipedia. The Arch Linux Wiki. lwn.net. Etc. These sites are very roughly concerned with "this is what I care about; if you do too, great, this is useful to you".

2. Commercial websites that employ sophisticated means to try to enlarge market share and retain users. A/B testing. "Seamless" experiences aimed at getting more views, with user experience as an afterthought (a sort of evolutionary pressure, but not the only one).

Complaining that camp #2 exists is strange. It's a bit like lamenting the fact that chocolate bars aren't just chocolate bars; they have flashy wrappers, clever ingredients, an optimized sugar ratio, a crunchy bit and a non-crunchy bit, etc.

It works! A Snickers bar is a global blockbuster, and "Tesco chocolate bar" is the functional chocolate bar that just does the job but will never attain that level of commercial success; it serves a different role.

-----

My personal view:

Fundamentally, what I want when I click a link from an aggregator is an "article.txt" with perhaps a relevant image or two. Something like http://motherfuckingwebsite.com/ maybe.

But if a site actually did that, a website like The Guardian, I'd fire up wget, strip all the advertising, strip the fact that it's even The Guardian, and read it like a book. If everyone does that, then no one makes any money and the site dies.

So what we actually have is this constant DRM-style race to fight for our brains, to get us to look at adverts. It's not about jQuery; it's about advertising, branding, "self vs. other" (the integrity of a company as a coherent thing), etc.

I don't know what the answer is here. I think this is why I find concepts like UBI so appealing: I find it kind of alarming that we seem doomed to infect more and more of the commons with commercialization because we haven't found a solution to keep each other alive otherwise.
forgotpwtomain, about 9 years ago

How about browser bloat? Each Chromium tab on Linux takes an extra ~50-150 MB depending on the site, and I still have no idea what they need all of that memory for...
dempseye, about 9 years ago

I once bought a pre-made landing page template with all kinds of whizz-bang JavaScript libraries built in. The demo page was 4 MB. In the time it took to strip all the trash out of the template, I could have designed the page myself. I'll never do that again.

I wonder how much of the problem is due to bloated templates.
skarap, about 9 years ago

Looks like most of the discussion here is on network traffic.

Minifying JS and CSS, compression, CDNs and caching won't keep your browser from having to render all the stuff.

---

The stewardess on a new jet airliner:

"Ladies and gentlemen, welcome aboard our new airplane. On the second deck you'll find a couple of bars and a restaurant. The golf course is on the third deck. You're also welcome to visit the swimming pool on the fourth deck. Now, ladies and gentlemen, please fasten your seatbelts. With all this sh*t we'll try to take off."
jordigh, about 9 years ago

Just to clarify, since I was confused (I remembered that Doom 2 was about 30 megs uncompressed, which websites are still a long way from): this metric appears to refer to the compressed size of the Doom 1 shareware distribution.

http://www.doomarchive.com/ListFiles.asp?FolderId=216&ContentsFolderId=216
stepvhen, about 9 years ago

> Recall that Doom is a multi-level first person shooter that ships with an advanced 3D rendering engine and multiple levels, each comprised of maps, sprites and sound effects.

Doom isn't true 3D; it's an advanced raycasting engine. The levels are all 2D, there are no polygons, and you can't look up and down. Doom has been ported to a TI calculator. Let's maintain some perspective here.
donkeyd, about 9 years ago

I visited a website a few days ago which used 2048x1365 JPEGs for 190x125 buttons. They had multiple buttons like this on multiple pages. I sent them an e-mail about it, but I don't expect them to fix it.
kgr, about 9 years ago

Send "models" rather than code. Low-level code is relatively unexpressive, contains considerable redundancy, and as a result is relatively large. By sending high-level models instead, which are then expanded on the client into working code, application download size can be greatly decreased. Models typically provide one to two orders of magnitude of compression over code.

This video shows how we do it: https://www.youtube.com/watch?v=S4LbUv5FsGQ

This document gives some results (like a GMail client that is 100x smaller): https://docs.google.com/a/google.com/document/d/1Kuw6_sMCKE72Ut0EsoPNFr9YKDkaw6DIfCDgExYkS2k/pub
collyw, about 9 years ago

I remember thinking years ago that my CV in Word took up more memory than my first computer (Acorn Electron, 32 KB RAM). It amazes me that I used to play Elite on that machine.
snowwrestler, about 9 years ago

The Doom install image was 35x the size of the Apollo guidance computer's memory.

Thirty-five times! Apollo software got us to the moon. Doom wasted millions of man-hours on a video game.

My point, of course, is that these comparisons are not actually that illuminating.

Are web pages much heavier than they need to be? Yes. This presentation very capably talks about that problem:

http://idlewords.com/talks/website_obesity.htm

Does comparing web pages to Doom help understand or improve the situation? No, not any more than comparing Doom to Apollo memory size helps us understand the difference between a video game and a history-altering exploration.
aorth, about 9 years ago

> The top ten sites are significantly lighter than the rest (worth noting if you want to be a top website).

Wow. That's nice to see, actually.
jokoon, about 9 years ago

I just watched https://www.youtube.com/watch?v=Q4dYwEyjZcY, a video about the early HTML standardization process, and it seems to explain all the ills of HTML.

So indeed, there is a huge optimization opportunity in having a stricter error model.

Also, I'm really wondering how much battery could be saved when surfing such pages.

Also, I'm sure there is a lot of potential in a pre-parsed document model. But that's a next level of engineering, I guess.
jakobdabo, about 9 years ago

Yesterday I discovered that Twitter's HTTP headers alone are ~3,500 bytes long (for 25 tweets!), with several long cookies, custom headers, and a Content Security Policy[1] containing ~90 records. Is this considered normal nowadays?

[1] https://en.wikipedia.org/wiki/Content_Security_Policy
alasano, about 9 years ago

I'm genuinely excited by ensuring great response times and minimal load on a website.

Locally, I see so many companies building good-looking but horrendously optimized websites for clientele who don't know enough to ask for better.

The last company I worked at was building a local search engine and was displaying thumbnails while loading full-size pictures hotlinked from businesses' websites. With an auto-loading feature at the bottom of the page driven by the PHP backend, an initial 5-6 MB page load could turn into 30+ MB within a few seconds of scrolling. Add to this that there was no gzipping, and caching was not properly configured either.

I tried my best to get some changes going, but the senior (and only other) dev wouldn't allow any modifications to the current system "for the moment". It was a bit frustrating to see so many easy fixes ignored.
njharman, about 9 years ago

So, where can I play Doom in the browser?
AdamN, about 9 years ago

The only real solution is a search engine that allows the end user to filter results by the maximum total page size. I've often wondered why DuckDuckGo doesn't do this, as well as filter search results by the number of ad networks used, etc.
perseusprime11, about 9 years ago

Enable Ghostery and load cnn.com, and you will see why web pages are so heavy these days.
CaptSpify, about 9 years ago

Disclaimer: my own blog: https://blog.thekyel.com/?anchor=Why_I_Block_Scripts_and_Ads

I kept looking for a "minimal" blogging platform, but they all had too much bloat/JS/etc. I guess minimal means different things to different people. I ended up just writing my own. The biggest post I have is 7.41 KB.

I used to be interested in front-end design, but since it's the industry standard to use $latest_framework instead of tried and proven practices, I've given up on that idea.
bb85, about 9 years ago

> The top ten sites are significantly lighter than the rest (worth noting if you want to be a top website)

Isn't that because the top websites have a lot more resources available to improve asset management, clean up, and refactor?
Joof, about 9 years ago

From this title: maybe Hacker News needs a Twitter?

Also: you can't use average page weight when you are just looking at the top ten. That downturn could represent a single website; all the others could be increasing in size.
maerF0x0, about 9 years ago

This all comes down to cost. It is much cheaper to have "bloat" than it is to pay devs to fix it. And customers find it much cheaper to deal with "bloat" than to find smaller alternatives. Sure, the average web page is bigger than Doom, but the CPU in my phone is approximately 100x faster (times multicore, too?) than the 486 that ran Doom.

Sure, if man-hours were free, we could trim it all down to (my rough guess) about 1/10th the size. But at $100, or even $10, an hour, it's just not worth it. Pay the GBs to your carrier; spend $50 more on a better phone.
chowes, about 9 years ago

Wondering if an idea like this would work:

Bundlers like webpack already import JS in a modular structure. I'm wondering if we could do some profiling of popular npm module combinations (I know many people use React + Lodash + Redux Router, etc.), bundle them up, and have webpack load those combos from a CDN via <script>?

Now, this would probably require some work on webpack's end (the __webpack_require__(n) would have to be some sort of consistent hash), but at least everyone who blindly require('lodash')s would see an improvement?
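A simpler mechanism in this direction already exists: webpack's `externals` option resolves chosen imports from page globals, so the libraries can be served (and cached) from a shared CDN `<script>` tag instead of being bundled. A minimal config sketch; this is the stock `externals` feature, not the hashed-combo idea above, and the module/global pairs are just illustrative:

```javascript
// webpack.config.js (fragment)
// Any `import React from 'react'` in app code resolves to window.React,
// which a CDN <script> tag on the page is expected to have defined.
module.exports = {
  externals: {
    react: 'React', // import 'react'  -> window.React
    lodash: '_',    // import 'lodash' -> window._
  },
};
```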
jccalhoun, about 9 years ago

And the .kkrieger beta only uses 96K! https://en.wikipedia.org/wiki/.kkrieger
CM30, about 9 years ago

It's also about 2 times bigger than a lot of SNES and Mega Drive games, or about 4 times bigger than Super Mario World (512 KB).

As for why it's getting so insane, probably either:

1. Frameworks, since most people don't remove the code they're not using. For Bootstrap or Foundation, that can be a lot of extra code.

2. Content management systems, since stuff like WordPress, Drupal, Joomla, and any forum or social network script tends to add a lot of extra code (more so if you've added plugins).

3. The aforementioned tracking codes, ads, etc.
hackertux, about 9 years ago

I also recommend https://news.ycombinator.com/item?id=10820445
chasing, about 9 years ago

The only conclusion I can legitimately draw from this article is that in twenty years a single web page will be larger than the 65 GB Grand Theft Auto V install.
sergiotapia, about 9 years ago

Arguably, sites have been increasing in size for one simple reason: it directly results in increased sales.

Everything is sales.

If cleaner, "purer" sites made more money, you can bet the average web page would be 10 KB.

It's all about what translates to more sales. As such, you won't ever see a return to more traditional websites. Look at Amazon with its virtual dress models: heavy as hell, but they most certainly land more sales.
systematical, about 9 years ago

Amazon isn't what I'd call "light" at 4.7 MB, but looking at my company's market, all the bigger players are way lighter than us.
webscalist, about 9 years ago

What happened to semantic markup? In the name of rendering optimizations, many websites use a CSS background image instead of <img>.
apeace, about 9 years ago

This page clocks in at 935 KB in my browser. According to this same page, that is roughly the size of SimCity 2000.
intrasight, about 9 years ago

The publishers really have no incentive to address this until a critical mass of users installs adblock software.
awqrre, about 9 years ago

It's initially cheaper to make larger web pages: you don't have to optimize for size (most of the time a page would probably execute faster if it were smaller, but probably not always). Some others make pages larger on purpose, for obfuscation (like Google).
sugarfactory, about 9 years ago

Google developed SPDY, an efficient binary representation of HTTP messages. Maybe they will do the same thing for HTML. It would be much more efficient if one could design a binary representation of HTML that can only express well-formed HTML.
bendbro, about 9 years ago

Why can't all these frameworks just be cached? If a cross-site request to cdn.com/react-v1.0.js is cached under cdn.com, at most one download will be triggered. That seems to solve the problem, but maybe I'm missing something.
JoeAltmaier, about 9 years ago

...and my first computer had 128 *bytes* of RAM. And a 300-baud modem.
myared, about 9 years ago

GTA 5 is ~65 GB in size. One day, web pages will be bigger than that.
jmnicolas, about 9 years ago

> The average size of Web pages is now the average size of a Doom install

It's not really surprising in a world where a graphics driver is > 100 MB (the Nvidia driver for Windows).
NelsonMinar, about 9 years ago

Is there a Chrome extension that shows the size of a web page? There's a good one for page load time that I use, but I want kilobytes, with and without cache.
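Short of an extension, the Resource Timing API can approximate this from the DevTools console. A sketch: `transferSize` reports what actually crossed the network, so cache hits count as 0, and comparing a cold load against a reload gives the with/without-cache numbers the comment asks for.

```javascript
// Sum what the browser actually transferred for a page's subresources, in KB.
// Entries served from the HTTP cache report transferSize 0.
function totalTransferKB(entries) {
  return entries.reduce((sum, e) => sum + (e.transferSize || 0), 0) / 1024;
}

// In a live page:
// totalTransferKB(performance.getEntriesByType('resource'));
```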
ivanhoe, about 9 years ago

Doom is not a good measure. When web pages become bigger and more bloated than your average printer or scanner driver, then it will be alarming :)
Khaine, about 9 years ago

Isn't the answer just to create a reasonable standard library for JavaScript, so people don't need to link in megabytes of frameworks?
LordKano, about 9 years ago

Great, another kooky unit of measure.

"This new redesign gets us down to 0.4 Doom installs without sacrificing any of the visual elements."
thom, about 9 years ago

There's a reason that the economics of web development mostly work and the economics of games development mostly don't.
qaq, about 9 years ago

Too bad we can't measure it in football fields.
tacone, about 9 years ago

But they load faster than Doom used to.
damon_c, about 9 years ago

In 20 years the average size of web pages will be the size of a Quake 3 install. This is progress.
jrl, about 9 years ago

This is one of the reasons why I love a simple website without too many bells and whistles.
dclowd9901, about 9 years ago

Are we really complaining about web page size when fully 30% of web traffic is Netflix? This might be an unpopular opinion, but websites are no longer just HTML, CSS and JS. They're full-on applications with rich interaction and data visualization. Call me when they're larger than an average modern native app install.
partycoder, about 9 years ago

For Internet 3 we should call John Carmack and put all of the internet in a MegaTexture.
hammock, about 9 years ago

Can anyone explain why a simple web page is so much bigger now than a whole game?
brownbat, about 9 years ago

ronan has an account here; he commented on this as it was developing a few months ago:

https://news.ycombinator.com/item?id=9981707
Shivetya, about 9 years ago

...and here I remember that the PDF of the Turbo Pascal manual was so many multiples of the compiler's size that I needed a calculator to figure it out.
flexterra, about 9 years ago

What is the average size of a native mobile app?
elcapitan, about 9 years ago

Doom would have been the superior solution to the World Wide Web.
imaginenore, about 9 years ago

For the last project, I built the initial page load with the absolutely minimal JS, embedded into the page. Then it loaded the rest whenever it was needed. My coworkers were shocked at how quickly the page loaded.

It's actually better to show the user some progress bar than the standard browser's "Waiting for yoursite.com".

You can get away with a lot without jQuery, while still having clean-ish code.
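The load-the-rest-on-demand part of the approach above can be wrapped in a tiny helper; a sketch, where the dynamic-import path is illustrative:

```javascript
// lazyOnce wraps an expensive loader (e.g. a dynamic import) so the download
// is triggered at most once, no matter how many callers ask for it.
function lazyOnce(loader) {
  let promise = null;
  return () => (promise ??= loader());
}

// Usage in a page (module path and API are hypothetical):
// const getEditor = lazyOnce(() => import('./editor.js'));
// button.addEventListener('click', async () => (await getEditor()).open());
```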
ebbv, about 9 years ago

I would argue two things:

1) This is an irrelevant statistic.

2) Even if it were true, it's not that big of a deal.

This is irrelevant because most people don't browse the average web page. They browse the top few sites on the internet, and that's it. A more relevant statistic would be the sizes of the top 50 sites over the last 15 years. I imagine they may still have grown on average, but download speeds have also grown over that time, especially on mobile.

Even if we accept the premise that websites as a whole, including the most popular ones, are all growing and now average 2.2 MB each: who cares? 2.2 MB is nothing in 2016. Even on an LTE connection, that's probably between 1.5 and 4 seconds to download the full page. And a lot of that size is probably in ads, which nobody minds if they load last or not at all.

Lastly, this is a self-fixing problem. If a site is too bloated, users will stop going to it.

But I would propose that a lot of this increase in size is due to users (especially on mobile) having higher and higher resolution displays, which necessitates higher-resolution content, which of course is bigger.
pljns, about 9 years ago

The average web page now does more than the average Doom install; I don't see the relevance of this.

Although I do get really annoyed when I visit a blog post whose page is 100x larger than Dostoevsky's novels in .txt format. On my blog (https://pljns.com/blog/), jQuery and Genericons are often my largest file transfers, but I still clock in under 500 KB.