
Google Is Not God of the Web

403 points by davidblue almost 5 years ago

40 comments

gok almost 5 years ago

> 22.82 Mbps will reliably download very complex web pages nearly instantaneously.

The author may be unaware of how ridiculously huge web pages have gotten. I just loaded Wired.com, scrolled down and let it sit for a few seconds. It downloaded 96.2 MB, requiring over 33 seconds on one of those average connections. On a pay-as-you-go data plan, it would have cost about a dollar just to load that one page. The front page has about 500 words of content. It also covered 100% of the content with ads, twice.

This is unsustainable. Web developers have utterly squandered all efficiency gains of the last 30 years, and then covered their grossly inefficient work with impossibly annoying, privacy-invading advertising. Google should be applauded if they make these wasteful developers suffer monetarily until they shape up. They've already stolen untold amounts of time and energy from us all.
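The arithmetic behind those numbers checks out; a minimal sketch (the per-MB data price is an assumed figure, chosen only to reproduce the "about a dollar" estimate, not a quoted tariff):

```ts
// Back-of-envelope: time and cost to pull one Wired.com page load over an
// "average" connection. Assumptions: 1 MB = 8 Mbit; ~$0.01/MB pay-as-you-go.
const pageMB = 96.2;           // observed transfer for one front-page load
const linkMbps = 22.82;        // the article's "average" connection speed
const seconds = (pageMB * 8) / linkMbps;
const costUSD = pageMB * 0.01; // hypothetical per-MB price

console.log(seconds.toFixed(1)); // ≈ 33.7 s, matching "over 33 seconds"
console.log(costUSD.toFixed(2)); // ≈ $0.96 at the assumed rate
```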
yongjik almost 5 years ago

(Disclaimer: previously worked at Google search)

I think some commenters are attributing to Google an ulterior motive, whether ill- or good-intentioned, separate from its core business. But in this case no such motivation is necessary.

Basically, Google wants its users to be satisfied - otherwise it will lose to, say, Bing. So it measures user satisfaction - e.g., if a user clicks on a Google result and immediately hits the back button within three seconds, it's a strong signal that the user was not satisfied. And Google tries very hard to increase this "user satisfaction" (and other similar metrics), because not only does it help Google's business, but it also improves the service itself.

And, guess what? When a page takes fifteen seconds to load, lots of people hit the back (or close) button. Showing such a page *is* giving the user a bad experience. Unless there are literally no alternatives, it makes sense for Google to "penalize" such a page.

Of course no metric is perfect, so it will occasionally backfire and penalize a great page that takes thirty seconds to load. But that's life.
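The signal described here is often called a quick-bounce or "pogo-stick" click. As an illustration only - the record shape, field names, threshold, and aggregation below are assumptions, not anything Google has published - a per-result dissatisfaction rate could be computed roughly like this:

```ts
// Hypothetical click-log entry: a click on a search result and when (if ever)
// the user came back to the results page.
interface Click {
  resultUrl: string;
  clickedAt: number;    // ms since epoch
  returnedAt?: number;  // ms since epoch; undefined if the user never came back
}

// Fraction of clicks on a URL where the user bounced back within `windowMs`.
// A high rate would be a (noisy) signal that the page dissatisfied users.
function quickBounceRate(clicks: Click[], url: string, windowMs = 3000): number {
  const forUrl = clicks.filter(c => c.resultUrl === url);
  if (forUrl.length === 0) return 0;
  const bounces = forUrl.filter(
    c => c.returnedAt !== undefined && c.returnedAt - c.clickedAt <= windowMs
  );
  return bounces.length / forUrl.length;
}
```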
pornel almost 5 years ago

There are lots of questionable ways in which Google owns the web (AMP, reCaptcha harassing users without Google cookies, Chrome's "Fire And Motion" web standards strategy), but this one isn't one of them.

In the webdev community it's well known that good performance is very important for user satisfaction, and that's backed up by research. There are no ideal metrics, and unfortunately every single one of them has some dumb edge cases. You could endlessly bikeshed which metrics could be better, but this particular set is not unreasonable.

It makes sense for a search engine to pick websites that not only have the relevant content, but are also able to actually get that content on screen without frustrating users with slow-loading crap that makes the browser freeze and stutter.

Keep in mind that your high-end desktop web experience is in the minority. The web is now mostly mobile, and mobile is mostly low-end Android. That's a shit sandwich, and it desperately needs an intervention.
bosswipe almost 5 years ago

What's infuriating to me about these types of "signals" in the search rankings is that they have little to do with the content I'm searching for. Google will hide results that I might find useful because the webmaster hasn't kept up with whatever Google decided was today's best practices. How about ranking based on the best source for what I'm looking for?
gumby almost 5 years ago

Back in 1998 or so, people weren't just enthusiastic about the Google search engine because its results were good, but because the search page was simple and fast.

Compare that to Altavista or Yahoo, whose pages were larded with all sorts of irrelevant links and ads *around* the search results. Slow to load and hard to visually navigate.

I still think the sparse pages are the best.
kickscondor almost 5 years ago

> There is a very reasonable argument for essential services like search engines and news websites to conform to/adopt standards like AMP, but for the rest of The Open Web, ingenuity and risktaking should be encouraged, not discouraged, for the true good of all Peoplekind.

Hadn't really considered this - because minimalist page size is often such a given - but, for instance, many amateurs often don't yet know how to crush their PNGs and such.

> https://bilge.world/open

Cool - thanks for this!

(As an aside, it's great to see a continuation of topics like this - it's commenting on last week's article from Parimal Satyal. It makes this place seem more like a forum.)
NickHirras almost 5 years ago

If you direct their web vitals tool to test itself (https://web.dev/vitals/), the report isn't great:

https://lighthouse-dot-webdotdevsite.appspot.com//lh/html?url=https%3A%2F%2Fweb.dev%2Fvitals%2F
addicted almost 5 years ago

I think most people haven't internalized that Google is no longer a search engine but an answering engine.

A search engine tries to find all sorts of relevant information related to your query - the more the merrier (it's searching, after all) - and then sorts it in a way that puts the relevant results first. An answering engine, on the other hand, tries to minimize the number of results. In an ideal world, it would return only one thing, which tells you exactly what you want.

One example of this change is the fact that it's no longer useful to go beyond the first page or so of your results, because anything down that low is irrelevant as an answer and is probably discarded by Google anyway, which wasn't the case when it was a search engine.

I'm not saying this is a bad thing. In fact, I suspect that the majority of the time, the majority of people want an answer, not a multitude of results. But I think this is what leads to Google search changing in a way that does not meet many people's expectations here.

It means Google emphasizes stuff that gets people answers quickly. They parse text and reveal the answers on their page itself. And they are not very useful for exploring anymore.
cj almost 5 years ago

One anecdote where their "Largest Contentful Paint" metric fails, and fixing it degrades performance:

We have a large 300kb animated gif that takes up maybe 20% of the viewport above the fold. The gif demonstrates visually what our service does.

A couple months ago Webmaster Tools reported that page as being "slow", pointing to the large image download. So we decided to show the first frame of the gif as a (30kb) png file, and then swap in the gif 2 seconds after the page is fully loaded.

Except now the new "largest contentful paint" metric is failing on those pages because it includes the 2 second delay when the animated gif is swapped in. I guess technically they're not wrong in how they're calculating it.

In fewer words, Google doesn't like anything being lazy loaded if it's above the fold.

The metrics and how they're calculated are questionable. We ended up optimizing for Google and removed the lazy load (ignoring that we think it's a better UX to lazy load that specific gif).
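For reference, the swap described here takes only a few lines; a minimal sketch (the element id, file paths, and the 2-second delay are placeholders, not the commenter's actual code). Because the late-arriving gif is the largest thing above the fold, it becomes the LCP candidate, so the deliberate delay gets charged straight to the metric:

```ts
// Show a lightweight first-frame PNG immediately, then swap in the animated
// GIF shortly after the page has fully loaded. If the GIF is the largest
// element above the fold, LCP is reported at the time the GIF renders -
// which is exactly the penalty described above.
window.addEventListener("load", () => {
  setTimeout(() => {
    const hero = document.getElementById("hero-demo") as HTMLImageElement | null;
    if (hero) {
      hero.src = "/img/demo-animated.gif"; // replaces the small first-frame PNG placeholder
    }
  }, 2000);
});
```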
entropyneur almost 5 years ago

As a web developer who has recently spent an ungodly amount of time trying to make my pages meet Google's impossible standards for qualifying as "fast" on mobile, I sympathize with the author's point. But I think he's missing the even bigger picture. Personal computing is mobile now. And even though the phones have as many megabytes, kilotonnes, little clowns or whatever device greatness is measured in these days, browsing the web on them is still slow as hell. And I would seriously entertain the suggestion that it's all Apple's evil plot if every Android phone I ever used didn't suck donkey balls for browsing the web. Whatever the reasons for this are, what's at stake now is not the web's diversity but its relevance altogether. I'd rather live in a world where the web is needlessly optimized for performance than in a world of apps.
tschellenbach almost 5 years ago

It's funny that Google is so large that one way to grow their business is to improve the user experience of the internet as a whole.
seanwilson almost 5 years ago

> web.dev is operating on some irritating assumptions:
> 1. Smaller assets are ideal.
> 2. Minimalistic design is necessary.

This doesn't sound right to me. Aren't the three new page metrics mostly targeting what happens when the page initially loads?

https://web.dev/vitals/

> Largest Contentful Paint (LCP): measures loading performance. To provide a good user experience, LCP should occur within 2.5 seconds of when the page first starts loading.

> First Input Delay (FID): measures interactivity. To provide a good user experience, pages should have a FID of less than 100 milliseconds.

> Cumulative Layout Shift (CLS): measures visual stability. To provide a good user experience, pages should maintain a CLS of less than 0.1.

The first two are about initial loading. For the last one, you can avoid layout shift by e.g. reserving space for images that are yet to load fully.

For example, it sounds like your page could load 100MB of images with the most complex design ever, and it would get a good score as long as the initial viewport of the page displays quickly, is interactive quickly, and doesn't jump around as it's loading.

They sound like reasonable metrics to me in terms of fighting bloat, but with flexibility for web developers (as opposed to AMP). Who gets to decide the metrics is another issue, though.
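All three metrics can be observed in the field with the browser's standard PerformanceObserver API (the `web-vitals` npm package wraps the same observers with the official scoring rules). A minimal sketch that just logs the raw entries - the official definitions linked above add further rules on top of this:

```ts
// Largest Contentful Paint: the latest candidate before first user input.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  console.log("LCP candidate (ms):", entries[entries.length - 1].startTime);
}).observe({ type: "largest-contentful-paint", buffered: true });

// Cumulative Layout Shift: sum of shift values not caused by recent user input.
let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as any[]) {
    if (!entry.hadRecentInput) cls += entry.value;
  }
  console.log("CLS so far:", cls);
}).observe({ type: "layout-shift", buffered: true });

// First Input Delay: time from the first interaction until its handler ran.
new PerformanceObserver((list) => {
  const first = list.getEntries()[0] as PerformanceEventTiming;
  console.log("FID (ms):", first.processingStart - first.startTime);
}).observe({ type: "first-input", buffered: true });
```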
neya almost 5 years ago

If not Google, someone else must step in and set some standards. Either way, I don't see anything wrong: their platform, their rules. Don't like their rules? Don't use them. When fewer people use their products, they will listen to what customers want.

Having said that, what exactly do customers want? They want the best experience on whatever device they're on. This is 2020; so much has happened since the 1990s. We can't simply keep using standards from the 1990s.

> 22.82 Mbps will reliably download very complex web pages nearly instantaneously.

The author needs to come down from his high horse and use the internet in developing countries. I was in India the other day for a client meeting and I was on one of the largest networks there. I had a subscription to NYT and I tried to load an article, and whoa, it took three full minutes for the browser to load the article to the point where it was even barely readable. I'm not saying the network in India is slow; I'm saying that even with the best networks, when you're travelling your speeds will be in KBps. If we don't have strict standards, the sizes of these pages will only grow and grow.

Later that day, I loaded the same article from my desktop again. The site made a gazillion requests in the network tab to so many advertising vendors, and each of them had consistently sizeable assets. More than being offended at being sold out despite a paid subscription, I was offended by how ridiculously unoptimized sites like NYT are, despite being a popular, large-scale online publisher.

I'm happy that sites like NYT will be penalized if they don't provide their users a good experience.
magicalist almost 5 years ago

Not sure what this article is arguing.

Sometimes you want to make a slow website that doesn't fit well on a phone screen?

Leaving aside the fact that you *can* of course do that, and that if I'm using a search engine on my phone I probably (usually?) don't want to look at your slow site that I have to horizontally scroll...

> *Modern web design principles are very rarely directed at regular people looking to make a website on something they are interested in. Instead, the focus is on creating websites that perform well:*

> *Don't use too many colours. Write short, catchy headlines. Don't let content be too long. Optimise for SEO. Produce video content, attention span is decreasing. Have an obvious call to action. Push your newsletter. Keep important information above the fold. Don't make users think. Follow conventions.*

All that's true to some extent if you're making a *product* on the web and you have a few seconds to hook a customer before they move on. If you're making a website for enthusiasts in some niche, though, content is your draw and you can worry less about some of these things.
skynetv2 almost 5 years ago

Anyone who can force web developers to make more responsive, small, junk-free sites that focus on the user rather than ads will have my support. I don't see anyone else making an attempt to force a change. The author is mistaken in their views.
walshemj almost 5 years ago

I thought this was going to be a reasonable article, but it's just another whine from a designer who wants to add another 6MB of pretty animation to a web page.

Points 1 and 2 are totally wrong, and on 3, Google is moving away from AMP and letting normal pages rank.

I have wasted too many hours of my time on conference calls with people like this.
gdsdfe almost 5 years ago

I keep telling people that search shouldn't be a monopoly, but they just keep looking at me like I'm a crazy person.
skilled almost 5 years ago
Google Search is in a horrendous state right now. Search results have been getting worse each year, with interesting information being buried 20 pages down.<p>I really hope they have plans to improve this or find an approach that works as a middle ground for generic SEO content.
aabbcc1241 almost 5 years ago

Google Search is an index; it can make whatever decisions it likes. If we don't like it, we can use other indices - even better, we can build more alternatives.

The web is an open network. Anyone can share content as well as indices. I know it's mostly impossible to beat Google, but niche indices have their place to shine.

For example, here: HN is an index of hand-picked (mostly great) content, and there are multiple "unofficial" HN variants. See? The web is very diverse and free.
throwaway810 almost 5 years ago

I find this tweet about how Google approaches web standards illuminating. To quote:

> 1. design a flawed API (it's fine! APIs are hard)
> 2. ship it in the most-used browser, despite objections
> 3. get cross-browser working group to fix the API
> 4. oops, too late, that would break the web

https://twitter.com/Rich_Harris/status/1220412711768666114
verdverm almost 5 years ago

Google has largely made the internet a safer and more efficient system by pushing standards through their market dominance.

Is this a bad thing?
skywhopper almost 5 years ago

I agree with the general thrust of the article, but not a lot of the details. And then there's this:

> Our phones have as much RAM as my "studio" work desktop

This is unlikely to be true. From what I can find, the latest iPhone has 4GB of RAM, and Samsung is up to 8GB (and they are growing this stat fast, to be sure), but no "studio" desktop made in the last five years is going to have that little.

> 22.82 Mbps will reliably download very complex web pages nearly instantaneously.

This is definitely not true. It is true that the download time is not large, but between DNS and TLS latency and the fact that most "complex" web pages are built of assets from dozens of different servers, your actual wait time for the assets can be quite long. But even if you discount that, the render time is probably longer than the download time. If your page is that complex, I hope it's very beautiful to look at.
urda almost 5 years ago

And we are allowing it. From the dangerous tactic of allowing them to MITM all users via the AMP platform, to pushing their features only in their browser. A browser which, I may remind you, was pushed to success by abusing their market position.

Had Google pulled this in the 90's, they would have been attacked like Microsoft.
rawoke083600 almost 5 years ago

I see a lot of hate for SEO. Sure, it can be used for crappy things, but so can a lot of business functions. I see a lot of complaints from people saying "oh, but this stupid page outranks company abc or the page they would expect." I am just thinking out loud... if we are willing to employ "specialist" lawyers, accountants, programmers etc. in our business, why do we balk at employing an SEO specialist as a specialist business function?

There are many good SEO companies (true, you have to do your homework, but that is true for other service providers as well).

The dream is of course to not have to use them and just rely on Google and its good nature and/or algorithms... But is that not like saying: oh, we will just rely on never getting sued and therefore never need a lawyer, because crimes are bad?
Animats almost 5 years ago
Now go look at the page source for the home page of Google.
NetBeck almost 5 years ago

The web is optimized for Google Chrome, not the inverse. It's not surprising. Google is an advertising company with >90% search market share. There is little competition in the economy, only product differentiation.
entha_saava almost 5 years ago

Didn't expect something as silly as this, especially the assertion that page sizes don't matter and that, since internet speeds are going up, web people are free to use them.

I don't see internet costs going down per MiB. Not everyone is on an unmetered connection. And users expect that large amount of bandwidth to be used for Netflix or something like that.

And how do you say Google is being evil for doing this? Even if there were no search monopoly, and imagine there were two competing search engines: I think both of them would factor page load times into search results for a better user experience.
arexxbifs almost 5 years ago

> The entire point of the web was to democratize and simplify publishing [...] But the iPhone's [...] shitbox performance means we're all sort of ready to throw that progress away.

A hilariously ignorant statement. Simplified and democratized publishing doesn't require one smidge of JavaScript, not one pointless tracker cookie, not a single Facebook pixel, no hideous autoplaying videos, not even a jot of CSS. It needs nothing but HTML and text, rendered plenty fast by any computer made this side of 1995.
robbrown451 almost 5 years ago

It would be nice if you could specify preferences. For instance, if you hate tracking and ads, you could tell Google that and it would down-weight those results.
impalallama almost 5 years ago

I thought I'd be in favor of whatever this article was arguing, but then it went in a weird direction, arguing against certain things that are just considered good practice but are now bad because Google is saying them. Like, yes, I would consider it a good thing to have lighter websites. Unless your site is catering to some niche that *needs* to offer 1-to-1 perfect image quality, there's no reason not to compress your assets.
mrkramer almost 5 years ago

Casual users don't care about details in user experience; when you innovate on the casual experience, they will consider using a new search engine.
romaaeterna almost 5 years ago

I recently began having a lot of problems with Chinese "searchbot" traffic on a website. Filtering it out, my load went down by two orders of magnitude. It made me wonder how much of the purpose of this sort of thing is SEO. And how much slower Google is making the web for everyone by ranking on it, and therefore encouraging the shenanigans.
DevKoala almost 5 years ago
I wish there was a way to index the web which was not so susceptible to marketers gaming the ranking system; it is clear Google has lost the battle there. Moreover, I don’t believe it is in Google’s best interest to surface the best results for me, but those that will generate them the most revenue.
z3t4 almost 5 years ago

Many inlinks are good for search ranking - people create fake sites and spam. Domain is important for search ranking - people pick a domain after the search keyword. Faster pages rank higher - people make fast websites. Quality content ranks higher - as if that's ever going to happen.
pretendscholar almost 5 years ago

Is there a formal study available that examines the tradeoffs of Google vs. a search engine like DDG?
youeseh almost 5 years ago

God, how? Like, in the monotheistic sense or the polytheistic sense? If we're going poly, then Google is definitely one of the Gods of the web. Maybe even the top God, but there may be some competition there.
replyifuagree almost 5 years ago

My favorite Google war story is implementing a web application using Google's Polymer, only to watch their Google bot choke on crawling the site. It took about a year for them to get their bot working.
yingbo almost 5 years ago

Of course Google is not God: money is.
karakanb almost 5 years ago

The article is an unexpected take on what I would have assumed were "the basics of web development" and not something to be argued against. I would like to touch on some of the points here, as the arguments of the article were not very clear to me:

> The simple assumption that it is always better to have the smallest page possible – that images should be resized and compressed to hell and typography/other elements should be few in number.

I strongly agree with this statement overall, and the article doesn't seem to provide any counter-arguments against it. We need to serve the smallest page possible because the larger the page, the more resources it consumes; it is that simple. Every _unnecessary_ byte added to a page literally translates to more storage for the page, more processing power to prepare the page, more data sent on the wire, more data for the client side to interpret, and all of these add up to more energy used to consume that page and more resources wasted. If I can remove one byte from a page, this is for sure a win for everyone; one byte is one byte. Whether that saving is relevant considering the scale and the effort is a whole other discussion, and the claim was never "send the smallest page at all costs". Considering the time and effort, if there is a way to send a smaller page, then do it; it is no different than turning off the lamp in an empty room, just on a different scale.

> Instantaneous page loads should be priority over any other standards of measure for a web page – like interesting design, for instance.

I have never seen such a claim anywhere before; needs citation. As a developer, I think the look and feel of a page is as important as performance or efficiency, and the web on its own can be used as an art platform, which would make this whole point irrelevant. Again, the overall point is this: if you can offer the same thing with smaller pages with a reasonable amount of effort, do it.

> Minimalistic design is necessary.

I have never seen such a claim; needs citation. As a user, I prefer cleaner design over fancier things, but this is neither a "rule" nor the industry standard. There is various research on this topic and I am no expert on it, but a joint study [1] by the University of Basel and Google/YouTube User Experience Research shows that users perceive cleaner designs as more beautiful, making the point that if user perception is a goal for the given webpage, then keeping things simple might actually make a difference there. Again, it depends on the use case.

> 22.82 Mbps will reliably download very complex web pages nearly instantaneously

This is a pain I live with every day. I have a 4-year-old mobile phone with 6GB of RAM, and it takes at least 8-10 seconds for Medium to be usable over a ~100Mbps connection, counting fetching the page and rendering / interpreting it. This is exactly the point I was making above: if the page were smaller, it would have actually made a difference of seconds. The same device over the same connection at the same time opens bettermotherfuckingwebsite.com in under a second, so there is something to be seen there.

In addition to that, even if I had a 1 Gbps connection, a 1-byte waste is a waste, regardless of my connection speed. I am not talking about the effort of saving that byte, but it is important to acknowledge the waste.

> Google has the right to dictate "Best Practices."

This is a point I agree with more than the other ones, but it seems like a separate topic to me. The previous arguments were against the practices themselves, and this one is against the entity supplying those practices. Even though I agree with the majority of the points there, it would have been a more informative read if the claims and frustrations had been stated with better point-by-point explanations and data to back them up. Google having huge power and a monopoly to push people toward certain standards is a big problem, but it is not clear in the article whether the author is arguing against the practices or against Google itself.

Overall, I believe it would have been a more resourceful article if the points and claims against the given practices had been backed by better alternatives and data. We all accept that more data means more processing power and more energy, so trying to minimize it is an important goal; if the author thinks it should not be, I would be more interested in the answer to "why?" than in a rant against long-standing practices.
verdverm almost 5 years ago

This is the best place I know of to have thoughtful discussions and talk about what most consider taboo.

http://www.paulgraham.com/say.html

http://www.paulgraham.com/resay.html

There are times when it gets heated, but those are quickly shut down by the very excellent community moderation.