Dear Optimizely: Your statistics do not tell you that B has a 95% chance of being better than A. Your statistics tell you that the observed excess of B over A would arise with less than 5% probability if B and A were actually equally effective.<p>A Bayesian would understand this in terms of prior probabilities and likelihood ratios, but to put it into nontechnical terms, suppose that you tried out 15 different alterations and none of them seemed to work. Then on the 16th, your detector goes off and says, "Less than 5% probability of these results arising by chance!" Do you conclude that it's 95% likely that this version is genuinely better? No, because the first 15 failed attempts told you that improving this webpage is actually pretty hard (the prior probability of an effective improvement is low), and now when you see that the 16th attempt has a result with a less than 5% probability of arising from chance, you figure "Eh, it's worth testing further, but probably it <i>is</i> just chance."<p>Another extremely important point is that the classical statistics you learned to use to decide that something was <5% likely to arise by chance only apply if you decided in advance to do exactly that many trials and then stop. Your chance of finding, on <i>some</i> trial, that your running total of results is "statistically significant", when A and B are actually identically effective, is <i>considerably greater</i> than 5%. See <a href="http://lesswrong.com/lw/1gc/frequentist_statistics_are_frequently_subjective/" rel="nofollow">http://lesswrong.com/lw/1gc/frequentist_statistics_are_frequ...</a> - a trial I ran with 500 fair coinflips had at least one step where the cumulative data "rejected the null hypothesis with p < 0.05" 30% of the time.<p>You're not really to blame for this mistake, because the horrid non-Bayesian classical statistics taught in college are just about <i>impossible</i> to understand clearly; but it does sound to me like someone at your org needs to study (a) Bayes's Theorem (b) the case for reporting likelihood ratios rather than p-values (likelihood ratios are objective, p-values decidedly not) and (c) the beta distribution conjugate prior (which would make progress toward having priors and likelihood ratios over "These two pages have a single unknown conversion rate" or "These two pages have different unknown conversion rates"). Or in simpler terms, "Someone at your company needs to study Bayesian statistics, stat."
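To make the optional-stopping point concrete, here's a rough simulation sketch (my own illustration, not anything Optimizely runs; it uses a normal approximation to the binomial rather than an exact test). It asks: if you peek at the running total after every flip of a fair coin, how often do you see "p < 0.05" at least once within 500 flips?

    # Sketch: peeking at a running significance test on a FAIR coin.
    # Normal approximation to the binomial; purely illustrative numbers.
    import math
    import random

    def two_sided_p(heads, n):
        # z-score for observing `heads` heads out of n fair flips
        z = (heads - n / 2) / math.sqrt(n / 4)
        return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

    def peeks_look_significant(flips=500):
        heads = 0
        for n in range(1, flips + 1):
            heads += random.random() < 0.5
            if n >= 30 and two_sided_p(heads, n) < 0.05:
                return True   # an interim look "rejected the null"
        return False

    runs = 2000
    hits = sum(peeks_look_significant() for _ in range(runs))
    print(f"crossed p < 0.05 at least once in {hits / runs:.0%} of runs")
    # Peeking like this fires far more often than the nominal 5%.

The exact rate depends on when you start peeking and how you compute the p-value, but it's the same effect as the roughly 30% figure in the linked post: stopping when the running total looks significant inflates the false-positive rate well past the nominal 5%.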
They increased the number of people submitting their URL; there's no telling whether that actually resulted in more leads for them.<p>What I have found is that a simple landing page that tells the user exactly what you are providing, and is free of any confusion, works best over the long term.<p>I've run hundreds of thousands of website visitors through Google Website Optimizer in multivariate tests, and what I've found is that over time there is little to no difference in conversion rate for minor landing page changes. The biggest jumps come from eliminating content from the design and clarifying the message.<p>Looking at the small number of users they sent to this landing page, I would call the results inconclusive. You can rattle off statistics to me all day long, but you can't change the fact that humans don't behave with the predictability that coin flips and physics do. (It's really chilling when you see how many drugs the FDA has approved on tiny margins of change/success.)
As always, you need to be wary of how these results are reported.<p>AJ already pointed out that they're not measuring "conversions" in the sense of converting to paying customers, but converting to "entering a URL in a field".<p>The 29% increase is always misunderstood (by clients at least).<p>The original page had a conversion rate of 8.9%.<p>The page they ended up with had a conversion rate of 11.5%.<p>The change is that an additional 2.6 percentage points of visitors are now entering their URLs in a field and clicking a button.
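For anyone puzzled about where the 29% comes from, here's the arithmetic as a quick sketch (just the two rates quoted above, nothing from Optimizely's actual report):

    # Absolute vs. relative lift for the quoted conversion rates.
    control = 0.089   # 8.9% on the original page
    variant = 0.115   # 11.5% on the winning variation

    absolute_lift = variant - control              # ~0.026 -> +2.6 percentage points
    relative_lift = (variant - control) / control  # ~0.292 -> the "29%" headline

    print(f"absolute: +{absolute_lift:.1%} points, relative: +{relative_lift:.0%}")

Both numbers are true; the headline just reports the relative one, which always sounds bigger when the baseline rate is small.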
This is sorta related but since I concluded the test today and this post is here, I thought I'd share.<p>I ran a 5-way split test for 9 days on my newsletter's signup page (JavaScript Weekly). My original page was the 2nd best performing, but an identical page just <i>without</i> the subscriber count got an 8% higher conversion (or about 20% more signups in all) with 90% confidence at the end of testing.<p>The worst performer? A signup page with no screenshot preview of the newsletter. It sent conversions from about 37% down to a mere 3% (!!) Lesson learned? Always have visuals or screenshots on pages where you're trying to get people to sign up for things they aren't sure about.
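If you want to sanity-check a "90% confidence" figure like that yourself, one simple route is the Beta-posterior comparison mentioned upthread. A rough sketch, with made-up visitor and signup counts (these are not the real JavaScript Weekly numbers):

    # Rough Bayesian check: P(variant's true signup rate > control's),
    # using Beta(1, 1) priors and posterior sampling.
    # NOTE: the counts below are hypothetical, for illustration only.
    import random

    control_visitors, control_signups = 1000, 370
    variant_visitors, variant_signups = 1000, 400

    samples = 100_000
    wins = 0
    for _ in range(samples):
        p_control = random.betavariate(1 + control_signups,
                                       1 + control_visitors - control_signups)
        p_variant = random.betavariate(1 + variant_signups,
                                       1 + variant_visitors - variant_signups)
        wins += p_variant > p_control

    print(f"P(variant beats control) ~ {wins / samples:.1%}")

With counts on that order you'd see something around 90%; with much less traffic the same observed lift gives a far weaker answer, which is why the sample size behind a "confidence" number matters as much as the number itself.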
Does anyone know of a central repository for all of these little landing page optimization tweaks? I know each site is different and you just have to test to know for certain, but there should be some generally vetted decisions that would make a good boilerplate.
We've been using Optimizely more and more extensively (we have a slightly unusual use case for it), and it's been <i>fantastic</i> at successfully ratcheting up conversions. We use it in concert with mixpanel when we need to push people along a funnel.<p>The tool's insanely easy to implement, a joy to use, and I get to rely on them to tell me when something is statistically meaningful.
It would be very cool if Optimizely could gather the A/B test results across all customers, run some statistical analysis, and publish which optimizations are significant. Sort of like an OKTrends for websites.<p>That would be a great resource for initial usability and design decisions, which could then be tweaked and further optimized with their product.
That "Enter Your website URL" field is really annoying. The easiest way to change the default "<a href="http://www.example.com" rel="nofollow">http://www.example.com</a> to your website's URL would normally be to double click the "example" and type your address, leaving the boilerplate "<a href="http://www." rel="nofollow">http://www.</a> and ".com" and such.<p>But you can't do that due to the fancy javascript and everything. Have to type the whole thing yourself.<p>If the field <i>cleared</i> when you gave it focus, it wouldn't really matter and I'd just type in the URL myself. But the text remains, in the background, taunting you. It even appears to highlight the "example" part if you double click it.
I got a lifetime free Optimizely account as part of an AppSumo deal. I can safely say it's one of the simplest yet most powerful and effective web products I've ever used.<p>The ease with which you can make changes is astonishing, and there are no limits if you know a little jQuery.
Kinda off topic, but I tried it on top websites that use long-polling (like Quora), and it always fails. I guess the app waits for all the resources to load completely, which for long-polling websites happens long after the DOMready event.
I wish I could apply Optimizely A/B tests to everything I do. Like measuring the best way to word movie choices so that the one you secretly want gets chosen (conversion)!<p>Anyway, interesting statistics. A larger sample size would, I think, make the results more dramatic.