Wow, did not see that coming. This article actually confirms the cynical hypothesis I've been entertaining - that most "data-driven" marketing and analytics is basically marketers bullshitting each other, their bosses, their customers, and themselves, because nobody knows much statistics and everyone wants to believe that if they're spending money and doing something, it must be bringing results.<p>Some quotes from the article supporting the cynical worldview:<p>--<p>"Most A/B testing tools recommend terminating tests as soon as they show significance, even though that significance may very well be due to short-term bias. A little green indicator will pop up, as it does in Optimizely, and the marketer will turn the test off. But most tests should run longer and in many cases it’s likely that the results would be less impressive if they did. Again, this is a great example of the default settings in these platforms being used to increase excitement and keep the users coming back for more."<p>This stops just short of saying outright that Optimizely does it on purpose.<p>--<p>"In most organizations, if someone wants to make a change to the website, they’ll want data to support that change. Instead of going into their experiments being open to the unexpected, open to being wrong, open to being surprised, they’re actively rooting for one of the variations. Illusory results don’t matter as long as they have fodder for the next meeting with their boss. And since most organizations aren’t tracking the results of their winning A/B tests against the bottom line, no one notices."<p>In other words, everybody is bullshitting everybody, but it doesn't matter as long as everyone plays along and the money keeps flowing.<p>--<p>"Over the years, I’ve spoken to a lot of marketers about A/B testing and conversion optimization, and, if one thing has become clear, it’s how unconcerned with statistics most marketers are.
Remarkably few marketers understand statistics, sample size, or what it takes to run a valid A/B test."<p>"Companies that provide conversion testing know this. Many of those vendors are more than happy to provide an interface with a simple mechanic that tells the user if a test has been won or lost, and some numeric value indicating by how much. These aren’t unbiased experiments; they’re a way of providing a fast report with great looking results that are ideal for a PowerPoint presentation. <i>Most conversion testing is a marketing toy, essentially</i>." (emphasis mine)<p><i>Thank you</i> for admitting it publicly.<p>--<p>Like whales, whose tumors grow so big that they catch cancers of their own and die[0], it seems the marketing industry - that well-known paragon of honesty and teacher of truth - is being held in check by its own toolmakers, who apply the industry's honourable strategies to the industry itself.<p>I know it's not a very appropriate thing to do, but I <i>really</i> want to laugh out loud at this. Karma is a bitch. :)<p>[0] - <a href="http://www.nature.com/news/2007/070730/full/news070730-3.html" rel="nofollow">http://www.nature.com/news/2007/070730/full/news070730-3.htm...</a>
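<p>If you want to see why the "stop at the first green light" quote is damning, here's a toy simulation (mine, not from the article; the conversion rate, batch size, and number of interim looks are arbitrary assumptions). It runs A/A tests - both variants identical, so any "winner" is by definition a false positive - and declares significance the moment an interim check hits p &lt; 0.05, exactly as a marketer watching a live dashboard would:<p>

```python
import math
import random

def p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for a two-proportion z-test (normal approximation)."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0  # no conversions yet; nothing to compare
    z = (conv_a / n_a - conv_b / n_b) / se
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def run_aa_test(rng, rate=0.05, batch=100, batches=50):
    """Simulate an A/A test (both arms convert at the same rate).

    Check the p-value after every batch of visitors and stop at the
    first p < 0.05 -- the 'green indicator' behaviour.  Since the arms
    are identical, returning True is a guaranteed false positive.
    """
    conv_a = conv_b = n = 0
    for _ in range(batches):
        conv_a += sum(rng.random() < rate for _ in range(batch))
        conv_b += sum(rng.random() < rate for _ in range(batch))
        n += batch
        if p_value(conv_a, n, conv_b, n) < 0.05:
            return True  # marketer sees "significant!" and stops the test
    return False

rng = random.Random(42)
trials = 500
false_positives = sum(run_aa_test(rng) for _ in range(trials))
print(f"False positive rate with peeking: {false_positives / trials:.0%}")
```

<p>At a fixed, pre-committed sample size, a p &lt; 0.05 threshold would produce a false "winner" in about 5% of these A/A runs; with 50 interim peeks, the observed rate comes out several times higher. Each look is another chance for noise to cross the line, which is exactly why "test turned significant, turn it off" produces impressive-looking results that evaporate against the bottom line.<p>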