Re: the Google results. Google uses another approach to backwards compatibility that devs may want to consider -<p>They branched the renderer codebase entirely, and switch to the old renderer (which consumes an API feed of search results that had better be stable, or else Google has bigger problems) for certain UAs. IE, non-evergreen browsers, and no-JS visitors get a renderer frozen in time in 2011; that's why it looks different from the current page. IE <6, pre-Mozilla Netscape, Lynx, WWW, etc. get a renderer frozen in time in 2007. Yes, you can still get the 10-blue-links experience if you set your user agent to Netscape 3 or something similar.<p>This has several advantages. Modern-day feature developers and QA testers don't need to <i>think</i> about ancient browser support - their code won't even run in those browsers, and they aren't touching the code that will. Polyfills can be kinda hard to implement on really old browsers, they still run the risk of bugs, and they don't help at all if JS is unavailable. The usual downside of branching & copying code is the maintenance headache of keeping both versions in sync; that doesn't apply if one version is frozen in time forever and never maintained. It's relatively little work to set this up (you usually just need to branch the files, add an if-statement on the UA, and ensure that the backend APIs remain stable), and it requires virtually no ongoing maintenance. Old browser users generally don't expect the latest & greatest features; if they did, they'd be on newer browsers. And they don't pay the performance cost of new features, which can matter a lot when they're also on old computers and old connections.
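<p>For the curious, here's a minimal sketch of that if-statement in Go. The function names, UA tokens, and port are my own illustrative choices; Google's actual UA-sniffing rules and server stack aren't public, and real UA parsing is far messier than a couple of substring checks:

    package main

    import (
        "fmt"
        "log"
        "net/http"
        "strings"
    )

    // isLegacyUA is a toy stand-in for real UA parsing; the tokens
    // checked here are illustrative guesses, not Google's actual rules.
    func isLegacyUA(ua string) bool {
        return strings.Contains(ua, "MSIE") || // old IE
            strings.Contains(ua, "Lynx") ||
            ua == "" // unknown clients also fall back to the frozen renderer
    }

    // Both renderers sit on top of the same stable results API;
    // only the presentation layer is branched.
    func renderLegacy(w http.ResponseWriter, r *http.Request) {
        fmt.Fprintln(w, "<html><body>ten blue links, frozen in time</body></html>")
    }

    func renderModern(w http.ResponseWriter, r *http.Request) {
        fmt.Fprintln(w, "<html><body>current, feature-rich page</body></html>")
    }

    func handler(w http.ResponseWriter, r *http.Request) {
        if isLegacyUA(r.UserAgent()) {
            renderLegacy(w, r) // frozen branch: new feature work never touches it
            return
        }
        renderModern(w, r)
    }

    func main() {
        http.HandleFunc("/", handler)
        log.Fatal(http.ListenAndServe(":8080", nil))
    }

<p>The whole trick is that renderLegacy never changes after the branch point, so the usual keep-both-versions-in-sync pain never materializes; the only live contract is the results API both renderers consume.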