Hello, CEO here. We have been working on a completely new version of Visual Website Optimizer (called VWO). Here's the launch post: <a href="https://vwo.com/blog/launching-new-vwo/" rel="nofollow">https://vwo.com/blog/launching-new-vwo/</a><p>Happy to answer any questions on what's new and what our vision of A/B testing is.
Congrats Paras! The new dashboard is looking nice, and I especially like the ability to retroactively segment results based on different customer dimensions and discover new opportunities for personalization.<p>One thing that I've noticed is that traditional A/B testing is a pretty sub-optimal way of answering the question: 'What works better, A or B?'<p>In the most basic example of an A/B test, you have a variation A and a variation B, each shown to 50% of your user base. By definition, this approach sends half of your users to the worse-performing version for the entire duration of the test!<p>The automated approach is based on a bandit algorithm that dynamically updates the proportion of users shown a given variation. With each new piece of data that you collect on the test variations' conversion rates and confidence, the algorithm adjusts the percentages automatically so that better-performing variations are promoted and worse performers are pruned away.<p>This leads to:<p>1) faster results, because you're directing test resources (i.e., users and their data) to validate what you actually care about (i.e., confidence in the best variation's performance),<p>2) a higher average conversion rate during the test itself, because relatively more users are automatically being sent to the better-performing variation, and<p>3) less time and effort required to actively manage your experiments.<p>Though the math behind this approach is slightly more complex than a traditional A/B test, it's a no-brainer for those who are serious about making data-driven decisions, given how much better the results it produces are.<p>For anyone interested, here's a post we put together on how it works: <a href="http://splitforce.com/resources/auto-optimization/" rel="nofollow">http://splitforce.com/resources/auto-optimization/</a>
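To make the bandit idea concrete, here's a minimal sketch of one common bandit approach, Thompson sampling — not necessarily the exact algorithm Splitforce or VWO uses. The two variations and their conversion rates are made up for illustration; the point is just that traffic shifts toward the better performer as evidence accumulates:

```python
import random

random.seed(42)

# Hypothetical "true" conversion rates, unknown to the algorithm.
true_rates = {"A": 0.05, "B": 0.10}

# Beta(1, 1) prior on each variation's conversion rate,
# tracked as observed successes (conversions) and failures.
successes = {v: 0 for v in true_rates}
failures = {v: 0 for v in true_rates}
assignments = {v: 0 for v in true_rates}

for _ in range(10_000):  # simulate 10,000 visitors
    # Sample a plausible conversion rate for each variation from its
    # posterior, then show this visitor the variation whose sample is highest.
    sampled = {v: random.betavariate(successes[v] + 1, failures[v] + 1)
               for v in true_rates}
    chosen = max(sampled, key=sampled.get)
    assignments[chosen] += 1

    # Simulate whether the visitor converted, and update the posterior.
    if random.random() < true_rates[chosen]:
        successes[chosen] += 1
    else:
        failures[chosen] += 1

share_b = assignments["B"] / sum(assignments.values())
print(f"Share of traffic sent to the better variation B: {share_b:.0%}")
```

Early on the split is close to 50/50 because both posteriors are wide; as data comes in, the posterior for B concentrates at a higher rate and B wins most of the sampling draws, so the vast majority of later visitors see B — which is exactly the "fewer users sent to the loser" property described above.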
The #1 problem I've had in using external A/B testing platforms is that they allow flashes of the old content before the new is shown (if perhaps only for some users). We've seen huge problems with conversion when testing with those tools which we don't see when running A/B tests using server-side implementations.<p>If Wingify/VWO has solved this once and for all, it'd be <i>wonderful</i>.
Which version of the site would web crawlers and indexing engines see? I'm just trying to work out whether this kind of tool has applicability when researching SEO tweaks on a site.
Just a heads up: links on wingify.com are giving a 404.
<a href="http://wingify.com/about" rel="nofollow">http://wingify.com/about</a>
<a href="http://wingify.com/careers" rel="nofollow">http://wingify.com/careers</a>