This is awesome and solves a huge problem.

Every startup is constantly told to A/B test everything, but nobody tells them that startups rarely have enough initial traffic to do any meaningful testing. As a result you end up putting a lot of work into what is effectively a random landing page or email template that might actually be worse than what you started with.
Hey! I'm the author and I'd be happy to answer any questions you might have.

Confidence.js is based on the A/B testing code we use at sendwithus.

A/B testing math is hard, and we've put a lot of effort into getting it right. Let us know what you think!
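For the curious, the core idea behind most conversion testing is the standard two-proportion z-test. Here's a rough sketch of that math (simplified and illustrative only, not the exact code or API in the library):

    // Compare two conversion rates with a two-proportion z-test.
    function zTest(convA, totalA, convB, totalB) {
      var pA = convA / totalA;
      var pB = convB / totalB;
      var pooled = (convA + convB) / (totalA + totalB);
      var se = Math.sqrt(pooled * (1 - pooled) * (1 / totalA + 1 / totalB));
      var z = (pB - pA) / se;
      // |z| > 1.96 is roughly a two-sided 95% confidence level
      return { z: z, significant: Math.abs(z) > 1.96 };
    }

    zTest(50, 1000, 72, 1000); // 5.0% vs 7.2% conversion -> z ≈ 2.06, significant

The "not enough traffic" problem shows up directly in the standard error term: with small sample sizes the error bars stay wide and you rarely cross that threshold.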
Awesome work! Will look at replacing some hand-rolled stuff internally.

One gap I see with a lot of A/B testing analysis is that it only solves for conversion. While conversion is a great metric for many tests, in my experience revenue is often the metric that matters most. Whether it's traditional eCommerce or selling tiered subscriptions, a lot of testing is geared towards 1) getting the customer to buy and 2) getting them into a more expensive product or plan. In the subscription scenario, some sort of customer lifetime value model is even better.

I don't pretend to know all the math, but the calculations I've seen that focus on revenue (AOV * conversion) need order-level data (as opposed to aggregate), so it's not as easy to solve generically.
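To make that concrete, here's roughly what I mean (a sketch only: the function names are made up, and in practice revenue is skewed enough that you'd probably want a bootstrap rather than a plain t-test on per-visitor revenue):

    // Assumed input shape: one revenue number per visitor (0 for non-converters),
    // which is why order-level data is needed rather than aggregate counts.
    function mean(xs) {
      return xs.reduce(function (sum, x) { return sum + x; }, 0) / xs.length;
    }
    function variance(xs) {
      var m = mean(xs);
      return xs.reduce(function (sum, x) { return sum + (x - m) * (x - m); }, 0) / (xs.length - 1);
    }
    // Welch's t statistic on revenue per visitor; captures conversion and AOV in one number.
    function revenueT(revenueA, revenueB) {
      var se = Math.sqrt(variance(revenueA) / revenueA.length + variance(revenueB) / revenueB.length);
      return (mean(revenueB) - mean(revenueA)) / se;
    }

You can't derive that from conversion counts alone, which is why a generic conversion-only library can't cover it.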
Was expecting this to actually be https://github.com/spumko/confidence, from WM Labs. Strange both would choose the same name for A/B testing.