
Tell HN: Google penalises unrelated groups of pages for bad CWV

1 point by adzicg, over 2 years ago
TL;DR: fix Core Web Vitals even on pages you don't care about for SEO, because the pages you do care about can be unexpectedly penalised because of them. This changed somewhere in mid-November.

We had a few low-importance pages with suboptimal Core Web Vitals. Since they weren't important, Google wasn't expected to send any traffic there. Until mid-November this was not an issue at all: Google didn't send traffic to them, as expected, and they didn't cause problems for anything else.

Since 11 November, Google Search Console has been reporting that Core Web Vitals need to be improved for a group of pages, which included those low-importance ones, but also our homepage! The homepage CWV were already fully optimised, but Google seems to take an average metric across the whole group, and for reasons beyond my understanding it included the homepage there. We fixed the low-importance pages a week later and submitted for revalidation, which finally passed on Saturday (the console suggests revalidation can take up to 28 days).

The end result is that the site got a massive traffic boost (more than a 25% increase in clicks from Google per day). The low-importance pages still get no traffic, but the homepage gets a lot more than before. There were no other changes to the site for a while, and Search Console showed seven pages still needing optimisation as of last Saturday, so this is the only thing I can attribute the traffic boost to.

Conclusion: check the Core Web Vitals in Google Search Console even for low-importance pages; Google might now be applying penalties across a group of seemingly unrelated pages. Either fix them or exclude the unoptimised pages from the index using meta tags.
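For reference, the meta-tag exclusion the conclusion mentions is the standard robots noindex directive. A minimal sketch (the page path is just an illustration):

```html
<!-- In the <head> of each low-importance page: asks crawlers not to index it,
     so its Core Web Vitals stop counting against the rest of the site -->
<meta name="robots" content="noindex">
```

Note that the page must remain crawlable for the tag to be seen; a page blocked in robots.txt can stay in the index because Google never fetches the directive.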

1 comment

dazc, over 2 years ago
If page content is so poor as to be considered unimportant, then it's best practice to noindex via x-robots.

Do not even think about robots.txt - step away now!
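The X-Robots-Tag approach the comment recommends delivers the same noindex directive as an HTTP response header instead of a meta tag, which also works for non-HTML resources. A sketch assuming an nginx server (the `/drafts/` path is hypothetical; any server that can set response headers works):

```nginx
# Serve low-importance pages with a noindex header so crawlers
# drop them from the index without any change to the page markup
location /drafts/ {
    add_header X-Robots-Tag "noindex";
}
```

As with the meta tag, the URLs must not be disallowed in robots.txt, or Google will never see the header.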