This is a very good basic A/B test, and while it is certainly better than nothing and something a lot of websites should try, there are some important caveats that aren't mentioned in the article:

1. How were the experiment groups chosen? You have to be really careful with SEO tests because some pages might get 100x the traffic of others, so saying you split up 20,000 pages randomly isn't enough for a test like this. It is only meaningful if the two groups had similar traffic profiles and each was getting enough aggregate traffic for an increase to be noticeable (a rough sketch of one way to do the split is at the end of this comment).

2. SEO tests take a long time, so if you are doing one, I'd recommend more than 2 variations. You need to put the test up, wait for Google to index it, then wait a few weeks to see how the traffic changes. Since your turnaround time is at least a few weeks and maybe longer, try 4 or 8 variations if your traffic can support it.

3. I prefer my A/B tests to be a little more crazy, or at least to include a crazy variation among more normal ones. In my experience the biggest gainers (and also the biggest losers) are ideas that seem crazy. Getting your feet wet by adding "(Example)" to the title is fine, but also try a crazy variation that is completely different and see how it does. Give yourself a chance to be surprised by your audience so you can learn more about them. And even if the crazy variation loses by a lot, you have learned something important.

4. The facts are a little dubious here. Traffic is up 50% over a timespan, but there is no control: traffic generally goes up for growing websites even if you don't do anything. You should also show your work on the 14.8% increase and give us some error bars. You say it is significant; how did you calculate that? It does seem like the test was better, but it is also important to make valid claims about it. Intellectual honesty when running A/B tests is *really really* important [1].

1: http://danbirken.com/ab/testing/2014/04/08/pitfalls-of-bias-in-ab-testing.html
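On point 1, here is a minimal sketch of a traffic-stratified split (the page URLs and visit counts are made up for illustration, and this obviously isn't how Coderwall necessarily did it). The idea is to rank pages by traffic and randomly assign within each small traffic band, so no bucket ends up hogging the handful of pages that drive most of the visits:

    import random

    def stratified_split(pages, n_buckets=2, seed=42):
        """pages: list of (url, monthly_search_visits) tuples."""
        rng = random.Random(seed)
        ranked = sorted(pages, key=lambda p: p[1], reverse=True)
        buckets = [[] for _ in range(n_buckets)]
        # Walk down the traffic ranking in chunks of n_buckets and randomly
        # assign each chunk across the buckets, so every bucket ends up with
        # a similar mix of high- and low-traffic pages.
        for start in range(0, len(ranked), n_buckets):
            chunk = ranked[start:start + n_buckets]
            rng.shuffle(chunk)
            for bucket, page in zip(buckets, chunk):
                bucket.append(page)
        return buckets

    # Invented example data: a few big pages and a long tail.
    pages = [("/p/a", 12000), ("/p/b", 9800), ("/p/c", 450), ("/p/d", 400),
             ("/p/e", 60), ("/p/f", 55), ("/p/g", 12), ("/p/h", 9)]
    control, variant = stratified_split(pages)
    print(sum(v for _, v in control), sum(v for _, v in variant))

A purely random split of 20,000 pages would probably balance out too, but checking (or forcing) the traffic balance up front is cheap insurance against a fluke assignment.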
> The number of users clicking on Coderwall from Google increased 14.8% (yes, this was statistically significant).

Would be very curious how they calculated significance here.
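Same here. One plausible approach, assuming they had click and impression counts per bucket (e.g. from Webmaster Tools), is a two-proportion z-test; a minimal sketch with invented numbers below. Note it treats every impression as independent, which ignores clustering by page and so probably overstates the significance:

    from math import sqrt, erf

    def two_proportion_z_test(clicks_a, impr_a, clicks_b, impr_b):
        p_a = clicks_a / impr_a
        p_b = clicks_b / impr_b
        # Pooled click-through rate under the null that both groups are equal.
        p_pool = (clicks_a + clicks_b) / (impr_a + impr_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / impr_a + 1 / impr_b))
        z = (p_b - p_a) / se
        # Two-sided p-value from the standard normal CDF.
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
        return z, p_value

    # Made-up counts roughly matching a 14.8% relative lift in CTR.
    z, p = two_proportion_z_test(clicks_a=4200, impr_a=100000,
                                 clicks_b=4820, impr_b=100000)
    print("z = %.2f, p = %.4f" % (z, p))

If they only had per-page session counts rather than impressions, they'd need something different (e.g. comparing per-page traffic changes between buckets), which is exactly why showing the work matters.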