Don't believe the hype. Google has been saying that they can execute JavaScript for years. Meanwhile, as far as I can see, most non-trivial applications still aren't being crawled successfully, including my company's.<p>We recently got rid of prerender because of the same promise in Google's last article on this [1]. It didn't work.<p>1: <a href="http://googlewebmastercentral.blogspot.com/2014/05/understanding-web-pages-better.html" rel="nofollow">http://googlewebmastercentral.blogspot.com/2014/05/understan...</a>
So they're actually evaluating all the JS and CSS Googlebot consumes. That's insane.<p>Can we forget about any new competitors in search engine land now?
Not only do you have to match Google in relevance, you'll also have to implement your own BrowserBot just to download the pages.
Wow, I built a project that rendered JS-built webpages for search engines via NodeJS and PhantomJS. Rendering webpages is <i>extremely</i> CPU-intensive; I'm amazed at the amount of processing power Google must have to do this at Internet scale.<p>I really hope this works. Lots of JS libraries expect things like viewport and window size information, and I wonder how Google is handling that.
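For the curious, the core of that project was roughly this kind of PhantomJS script. This is just a sketch — the viewport size and the fixed 2-second settle delay are assumptions on my part, not anything Google has described:<p><pre><code>    // render.js — minimal prerender sketch, run as: phantomjs render.js <url>
    var page = require('webpage').create();
    var system = require('system');
    var url = system.args[1];

    // Many JS libraries read window/viewport dimensions, so give the page one.
    page.viewportSize = { width: 1280, height: 800 };

    page.open(url, function (status) {
      if (status !== 'success') {
        console.log('Failed to load ' + url);
        phantom.exit(1);
      } else {
        // Let client-side JS finish rendering before capturing the DOM.
        setTimeout(function () {
          console.log(page.content); // the fully rendered HTML
          phantom.exit();
        }, 2000);
      }
    });
</code></pre>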
This was the missing piece for Polymer elements / custom web components. Now that Google has confirmed it's indexing JavaScript, web-component adoption should take off.
Gary Illyes @goog said this was happening in Q1 this year, and, as others have mentioned, lots of other direct and indirect signals have pointed this way.<p><a href="http://searchengineland.com/google-may-discontinue-ajax-crawlable-guidelines-216119" rel="nofollow">http://searchengineland.com/google-may-discontinue-ajax-craw...</a>
March 5th: Gary said you may see a blog post on the Google Webmaster Blog as soon as next week announcing the decommissioning of these guidelines.<p>Pure speculation, but interesting... The timing may have something to do with Wix, a Google Domains partner, which is having difficulty with its customers' sites being indexed. The support thread shows a lot of talk along the lines of "we are following Google's Ajax guidelines, so this must be a problem with Google". John Mueller is active in that thread, so it's not out of the realm of possibility that someone was asked to make a stronger public statement.
<a href="http://searchengineland.com/google-working-on-fixing-problem-with-wix-web-sites-not-showing-up-in-search-results-233310" rel="nofollow">http://searchengineland.com/google-working-on-fixing-problem...</a>
Currently I use prerender.io and this meta tag:<p><pre><code> <meta name="fragment" content="!">
</code></pre>
I don't actually use #! URLs (or pushState, though I might in the future), but without both of these Google can't see anything JS-generated, at least when checking with Google Webmaster Tools.<p>Does this announcement mean I can remove the <meta> tag and stop using prerender.io now?
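For context, all that meta tag does is tell Googlebot to re-request the page as ?_escaped_fragment_=, and prerender.io-style middleware intercepts that request and returns a static snapshot. Very roughly, it works like this Express sketch — the snapshot lookup is a made-up placeholder, not prerender.io's actual API:<p><pre><code>    var express = require('express');
    var app = express();

    app.use(function (req, res, next) {
      // Googlebot saw <meta name="fragment" content="!"> and re-requested
      // the page with ?_escaped_fragment_= — serve the prerendered snapshot.
      if (req.query._escaped_fragment_ !== undefined) {
        return res.send(getSnapshotFor(req.path)); // hypothetical snapshot lookup
      }
      next(); // normal browsers get the regular JS app
    });

    app.listen(3000);
</code></pre>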
Any idea how related this might be to Wix sites getting de-indexed?[1]<p><a href="http://searchengineland.com/google-working-on-fixing-problem-with-wix-web-sites-not-showing-up-in-search-results-233310" rel="nofollow">http://searchengineland.com/google-working-on-fixing-problem...</a>
This might be obvious to anyone who has done SEO, but can Googlebot index React/Angular websites accurately? I was always under the impression that the isomorphic aspect of React helped with SEO (not just load times).
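As I understand it, that's exactly what "isomorphic" buys you: the same component renders to plain HTML on the server, so the crawler doesn't have to run any JS at all. A minimal sketch — the App component and catch-all route are placeholders, not a real production setup:<p><pre><code>    var React = require('react');
    var ReactDOMServer = require('react-dom/server');
    var express = require('express');
    var App = require('./App'); // hypothetical component, shared with the client bundle

    var app = express();
    app.get('*', function (req, res) {
      // Render the component tree to a plain HTML string for the initial response.
      var html = ReactDOMServer.renderToString(React.createElement(App));
      res.send('<!doctype html><html><body><div id="root">' + html + '</div></body></html>');
    });

    app.listen(3000);
</code></pre>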