For JavaScript-heavy sites, I still recommend following Google's AJAX-crawling guidelines and supporting the pre-rendered view request variable _escaped_fragment_:

https://developers.google.com/webmasters/ajax-crawling/docs/specification

You can do a good enough job by sending such requests to a PhantomJS instance, waiting for the page to load, outputting the PhantomJS-rendered HTML to the browser, and saving that HTML to a cache for faster access next time.[1]

There are also plenty of SaaS apps that will handle the pre-rendering for you.

An additional bonus of doing this is that you can intercept requests from engines and services that don't support _escaped_fragment_ (e.g. Facebook's external hit crawler) and always ensure you serve pre-rendered HTML to them, e.g. by matching on the user-agent string.

[1a] One potential hitch is knowing *when* a page has finished rendering. You could set a variable in your code and have PhantomJS wait for that.

[1b] Another hitch is that PhantomJS can time out, e.g. while waiting for an external JS library to load. It's sensible to check that your HTML output looks vaguely sane before sending it to the browser (e.g. for Angular, make sure there are no {{ }} blocks left in the HTML), and to send a temporary error code if something looks odd.
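Putting those pieces together, here's a minimal sketch of the kind of middleware I mean. It assumes Express, a phantomjs binary on the PATH, and a hypothetical render.js PhantomJS script that prints the rendered HTML of the URL it's given to stdout; the bot list and origin URL are made up for illustration:

```typescript
import express from "express";
import { execFile } from "child_process";

const app = express();
const cache = new Map<string, string>(); // naive in-memory cache; use Redis or disk in production

// Services that don't support _escaped_fragment_ and should always get pre-rendered HTML
const BOT_UA = /facebookexternalhit|twitterbot|linkedinbot/i;

app.use((req, res, next) => {
  const wantsPrerender =
    req.query._escaped_fragment_ !== undefined ||
    BOT_UA.test(req.headers["user-agent"] ?? "");
  if (!wantsPrerender) return next();

  const url = `http://localhost:3000${req.path}`; // hypothetical origin serving the JS app
  const cached = cache.get(url);
  if (cached) return res.send(cached);

  // render.js (not shown) loads the page in PhantomJS, waits for a "rendering
  // finished" flag set by the app's own code, and prints the resulting HTML.
  execFile("phantomjs", ["render.js", url], { timeout: 15000 }, (err, html) => {
    // Sanity check: leftover {{ }} Angular bindings mean rendering didn't
    // finish, so send a temporary error rather than caching broken output.
    if (err || /\{\{.*\}\}/.test(html)) {
      return res.status(503).send("Pre-render failed, please retry");
    }
    cache.set(url, html);
    res.send(html);
  });
});

app.listen(3000);
```

Note that PhantomJS itself requests the page with its own user-agent, so it falls through to the normal SPA response rather than recursing into the pre-renderer.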
I'm coming to the conclusion that SEO doesn't matter anymore, at least not this basic stuff: everyone is doing it, including your rivals, so it really isn't going to make you stand out from the crowd.

Search engines make their money from advertising your site, _not_ from sending you free traffic via SEO. So fire up your credit card and buy some AdWords for the key search terms in your business; that's what they want from you. I believe we are already in the post-SEO age of the Internet, and maybe we have been for a few years now.
Some additional info...

1. Don't let the new HTML5 tags distract or confuse you. Google knows how to deal with traditional DIVs, so only use the new tags (e.g. SECTION, ASIDE) if you know how to implement them properly.

2. Maintainability is a big one for page-speed optimizations. You can easily get 95% of the way there, but that remaining 5% often comes at the expense of maintainability and the ability to see/make changes easily.

3. Can't agree more with the section on redirects. 301s are often forgotten and can cause issues down the road with SEO and site functionality.

4. The last tip I would add is to monitor 404/access logs. They will often reveal SEO problems (see the sketch below).
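On point 4, here's a minimal sketch of such a log check (my own illustration, not from any official tooling). It assumes an nginx/Apache combined-format access log, and the log path is hypothetical:

```typescript
import { createReadStream } from "fs";
import { createInterface } from "readline";

async function top404s(logPath: string, limit = 20): Promise<void> {
  const counts = new Map<string, number>();
  // Matches the request section of a common/combined log line: "GET /path HTTP/1.1" 404
  const line404 = /"(?:GET|HEAD) (\S+) HTTP\/[\d.]+" 404 /;

  const rl = createInterface({ input: createReadStream(logPath) });
  for await (const line of rl) {
    const m = line404.exec(line);
    if (m) counts.set(m[1], (counts.get(m[1]) ?? 0) + 1);
  }

  // Print the most frequently requested missing URLs: prime candidates for 301s
  [...counts.entries()]
    .sort((a, b) => b[1] - a[1])
    .slice(0, limit)
    .forEach(([url, n]) => console.log(`${n}\t${url}`));
}

top404s("/var/log/nginx/access.log").catch(console.error);
```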
There is some good technical execution advice here, but the basic premise is wrong:

"From a development point of view SEO is the concern of how well a robot can read and understand your content. As we will see, a robot being able to read your content easily is normally a good thing for humans too."

I don't think we need different points of view on what SEO is. SEO is about getting your page ranked as high as possible for whatever keywords you're targeting. Technical execution is a tactic that supports that, not a different view of it.
If you want to see how Google rates your site's speed and user-friendliness, check out Google PageSpeed Insights. It's a very handy tool for guidance on what's slowing down your site.

A constantly updated and properly marked-up sitemap is also good to have (see the sketch below).
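For the sitemap point, here's a minimal sketch of generating one programmatically. The element names follow the sitemaps.org protocol; the example URL, date, and output path are made up for illustration:

```typescript
import { writeFileSync } from "fs";

interface SitemapEntry {
  loc: string;
  lastmod?: string; // W3C date, e.g. "2014-06-01"
  changefreq?: "always" | "hourly" | "daily" | "weekly" | "monthly" | "yearly" | "never";
}

function buildSitemap(entries: SitemapEntry[]): string {
  const urls = entries
    .map((e) =>
      [
        "  <url>",
        `    <loc>${e.loc}</loc>`,
        e.lastmod ? `    <lastmod>${e.lastmod}</lastmod>` : "",
        e.changefreq ? `    <changefreq>${e.changefreq}</changefreq>` : "",
        "  </url>",
      ].filter(Boolean).join("\n")
    )
    .join("\n");
  return `<?xml version="1.0" encoding="UTF-8"?>\n<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${urls}\n</urlset>\n`;
}

// Regenerate this whenever pages are added, removed, or substantially changed.
writeFileSync(
  "sitemap.xml",
  buildSitemap([{ loc: "https://example.com/", lastmod: "2014-06-01", changefreq: "weekly" }])
);
```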
I'm surprised he didn't mention using kebab-case vs snake_case in his URLs section.<p>Google recommends kebab-case and will afford a small SEO bump accordingly.
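For illustration, a minimal slug helper that produces kebab-case URL segments; this is my own sketch, not code Google publishes:

```typescript
// Turn an arbitrary title into a kebab-case URL segment, the hyphenated
// style Google's URL guidelines favor over underscores.
function kebabSlug(title: string): string {
  return title
    .toLowerCase()
    .normalize("NFKD")                 // split accented chars into base + combining mark
    .replace(/[\u0300-\u036f]/g, "")   // drop the combining marks
    .replace(/[^a-z0-9]+/g, "-")       // collapse everything else to single hyphens
    .replace(/^-+|-+$/g, "");          // trim leading/trailing hyphens
}

console.log(kebabSlug("What Every Developer Should Know About SEO"));
// -> "what-every-developer-should-know-about-seo"
```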
This is all basic stuff, but it won't affect your rankings that much. Yes, everyone should have it in place, but it doesn't represent the state of SEO.

What you *should* know about SEO is that it's all about getting organic links to your content, and having content that has a semantic connection to the keywords being searched for.
This is also a pretty good set of guidelines by Google to follow: http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf
Where is the evidence presented that supports any of these claims? I realize most of them seem like common sense, but as far as I know many are in fact false. In particular, the argument about semantic markup seems pretty dubious. Currently, I'm not aware of any search engine that cares about the new HTML5 semantic tags.

On that note, is there even one example of assistive technology that uses semantic markup? As I understand it, ARIA tags are needed for this; semantic markup is irrelevant, since most (all?) assistive technology is outdated as far as browser technology goes.
For basic on-page SEO when you are building a site, this is all good and useful information.

Developers should also know that publishing good technical content and open-source projects on your site will, over time, generate links and attention for your website. While those links aren't going directly to your product pages, in the long term having those pages link back to your main company pages/products will help those pages rank, which makes you money. It's a bit indirect, but still worthwhile.
Does anyone have any real evidence to support higher rankings for an increase in speed? Something like: "I cut page load time from 5s to 4s and saw an X% increase in traffic."

Reason being: all too often I see SEO developers use this claim as a hook to sell their services to clients and get into their codebase.
This talk makes so much sense.
https://www.youtube.com/watch?v=X1nNkS4SVRw

At first I thought the article was written in a similar vein of sarcasm.