This doesn't surprise me. The PageSpeed Insights site has a very atypical use case.<p>Anyway, the PSI 'score' is only a rough guide to help point you to problems. It can be kind of useful, in that if you get something like 97-100 then you can be pretty sure it's a very fast page... but it's also probably a page that doesn't do much beyond displaying something. With a more serious web app that does more interesting things, the score becomes a less useful indicator – it's often possible to make code changes that improve perceived performance but actually reduce the PSI score.<p>For example, it's impossible for PSI to really know if inlining a script is going to generally improve or worsen performance, because it doesn't know your site's usage patterns, or which parts of your UI need to render first for a user to feel that your UI 'is fast', whatever that means. UI performance is a subtler art than, say, algorithmic performance, and much less quantifiable. That's why many people prefer webpagetest.org, which comes with much smarter tools to record and analyse <i>how</i> the page loads, so you can actually improve UI performance.
It does poorly when empty, but Google Pagespeed on Google Pagespeed does well on Google Pagespeed.<p><a href="https://developers.google.com/speed/pagespeed/insights/?url=https%3A%2F%2Fdevelopers.google.com%2Fspeed%2Fpagespeed%2Finsights%2F%3Furl%3Dhttps%253A%252F%252Fdevelopers.google.com%252Fspeed%252Fpagespeed" rel="nofollow">https://developers.google.com/speed/pagespeed/insights/?url=...</a>
Amazes me how much weight people sometimes put on these types of tools, and how they start panicking because they enter a website they work on and it comes back with some red text.<p>Have had jobs where random marketing colleagues email the developers along the lines of "Why do we only score 7/10 in this test, 45.5% in this test, and why were 2 critical issues found in this test?"<p>Don't get me wrong, they are useful, but people act the same way they would if their virus scanner found their computer was completely compromised. Unless the tools have a major tangible impact on something that affects the business, i.e. SEO, I tend to ignore their findings unless they are a simple fix.
Also fails google.com on mobile. <a href="http://motherfuckingwebsite.com/" rel="nofollow">http://motherfuckingwebsite.com/</a> however is a 99/100.
The PageSpeed Insights tool gives itself [1] a high score [2]; what this link is showing is that it's giving its documentation pages [3] a low score. Not that that's much better.<p>[1] <a href="https://developers.google.com/speed/pagespeed/insights/" rel="nofollow">https://developers.google.com/speed/pagespeed/insights/</a><p>[2] <a href="https://developers.google.com/speed/pagespeed/insights/?url=https%3A%2F%2Fdevelopers.google.com%2Fspeed%2Fpagespeed%2Finsights%2F" rel="nofollow">https://developers.google.com/speed/pagespeed/insights/?url=...</a><p>[3] <a href="https://developers.google.com/speed/pagespeed/" rel="nofollow">https://developers.google.com/speed/pagespeed/</a>
Hilarious that this was just posted, as I had to answer this exact question yesterday. Are we working together?<p>Here's an email I ended up sending yesterday in reply to a Google Page Speed Test report:<p>> We can address most of these issues with some further optimizations.<p>> One thing that will always appear in the Google Page Speed Test reports is the "Should Fix" issue with "Eliminate render-blocking JavaScript and CSS in above-the-fold content". This is a much-discussed flag from Google that realistically only very simple, non-modern websites can avoid, and a test that Google itself can't pass: <a href="https://developers.google.com/speed/pagespeed/insights/?url=www.google.com" rel="nofollow">https://developers.google.com/speed/pagespeed/insights/?url=...</a>.
'Fails' is a bit harsh. Since when was it decreed that Google should make all of their web pages blazingly fast on mobile? Al Gore didn't pass any laws on that.<p>If Google make a developer-oriented tool then it is no surprise that it works brilliantly on the desktop and slightly sub-optimally on a mobile phone.
I am quite useless at web design, yet I can game Pagespeed to get 95% or higher in the score. Of course the javascript will be mashed into some black hole that can't be un-minified, and the way things load will not be well suited for a visitor seeing more than one page (using cached things). It's not as if nobody at Google could do what I do to 'game' Pagespeed. The fact that they haven't is good: you should never let scores from things like Pagespeed or YSlow determine how a web page is delivered, as that is like using a 'defeat device'.<p>I do wish they would do something about Pagespeed, as they have changed what it does over the years and the latest iteration didn't excite me; I preferred the previous one, in part because you could run it against 'localhost'.
The Pagespeed Insights API is pretty easy to use, if anyone is interested in Pagespeed Insights as a service.<p>I built an Android app for it as sort of a proof-of-concept back when Holo was a thing, and it was very straightforward. I'm actually kind of surprised that Google didn't build one themselves.
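For anyone curious what "pretty easy to use" looks like in practice, here's a minimal Python sketch of hitting the API. Note the caveats: the endpoint and JSON field names below are the v5 shape (`runPagespeed`, with the score under `lighthouseResult`); the API has been reversioned over the years, so the response layout from the Holo era would have differed.

```python
# Minimal sketch of querying the PageSpeed Insights REST API (v5 shape).
# The response field names are assumptions based on the v5 API and may
# differ in other versions.
import json
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"


def psi_request_url(target, strategy="mobile"):
    """Build the GET URL for a PageSpeed Insights run against `target`."""
    params = urllib.parse.urlencode({"url": target, "strategy": strategy})
    return f"{PSI_ENDPOINT}?{params}"


def fetch_performance_score(target, strategy="mobile"):
    """Fetch the overall performance score (0.0-1.0 in the v5 response)."""
    with urllib.request.urlopen(psi_request_url(target, strategy)) as resp:
        data = json.load(resp)
    return data["lighthouseResult"]["categories"]["performance"]["score"]


if __name__ == "__main__":
    # Requires network access and is subject to API quota limits.
    print(fetch_performance_score("https://example.com"))
```

An unauthenticated request works for casual use, but heavier usage needs an API key appended as a `key` parameter.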
cnn.com gets 95/100 on mobile user experience, which must be a joke of some kind. My own page (the development version) gets a similar score despite loading much faster. reddit.com also almost fails the desktop test.<p>This is a nice tool to give you suggestions about things you might want to change, but it's hard to use it as a predictor of page quality.