Nice writeup. But a large part of the main point seems to be surprise that client-side rendering isn't going to be indexed by search engines:<p><pre><code> > It occurred to me that the hype around client-side rendered apps
> is still alive and kicking. It’s not the first time I’ve heard
> server-side rendering negatively referred to as an “old school” technique.
and ...
> Chalk that up to my ignorance, believing the client-side rendered
> app-of-the-future hype, and not caring about SEO until the organic
> traffic numbers came in.
</code></pre>
This should <i>not</i> be a surprise. Obviously, if you only render content on the client, then search engines aren't going to see any of it. Client-side-only-rendered applications should be reserved for private pages, users' workspaces, and web <i>applications</i> — pages that a search engine will never see anyway.
How about just sniffing the user-agent and delivering perfectly rendered views/fragments of your SPA to crawlers through a server-side PhantomJS process?<p>If you use Backbone routing properly, this shouldn't be a problem.<p>Or use one of the "full-stack" JavaScript frameworks that merge the client and the server almost completely. Derby.js comes to mind, maybe even Meteor.<p>[edit]: you do mention PhantomJS in your summary; I didn't see it at first. But you don't say why you didn't use it.
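Roughly, that could be a middleware that detects crawler user-agents and shells out to PhantomJS for a pre-rendered snapshot. A minimal sketch, assuming a Node/Express app with a phantomjs binary on the server (the crawler regex, the port, and the 500ms render wait are illustrative, not production-ready):<p><pre><code>  // server.js -- serve crawlers a PhantomJS-rendered snapshot
  var express = require('express');
  var execFile = require('child_process').execFile;
  var app = express();

  var CRAWLER_RE = /googlebot|bingbot|baiduspider|yandex/i; // rough list

  app.use(function (req, res, next) {
    if (!CRAWLER_RE.test(req.headers['user-agent'] || '')) return next();
    // Hand the URL to a PhantomJS script that loads the SPA,
    // lets the client-side render run, and prints the final HTML.
    execFile('phantomjs', ['render.js', 'http://localhost:3000' + req.url],
      function (err, stdout) {
        if (err) return next(err);
        res.send(stdout); // fully rendered HTML for the crawler
      });
  });

  // render.js (the PhantomJS side)
  var page = require('webpage').create();
  page.open(phantom.args[0], function () {
    setTimeout(function () {      // crude wait for the app to render
      console.log(page.content);  // serialized DOM after JS has run
      phantom.exit();
    }, 500);
  });
</code></pre>Normal users still get the plain SPA; only crawlers pay the PhantomJS cost.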
I'm a bit afraid of this. Is it really so hard for Google to run something like PhantomJS and crawl JS sites? If Google handles CSS display:none correctly, why is it so hard to make JS crawling work? That's madness. This SEO (read: Google) stuff is dictating choices of frameworks and tech stacks. That's bad; it should be the other way around. I don't want to make websites for Google, I want to make them for users.
The first con needs a bigger caveat, because image size and the number of requests play a huge part in browser rendering. That is probably why the mobile app was so much slower than the desktop app; the small minified JS file that contains your app has little performance impact.