Has anyone done any A/B testing on SaaS conversion rates for the number of free trial days, e.g. 30 vs 45 vs 60-day trials?<p>FogBugz and Basecamp both offer 45-day trials, but 30-day trials also seem common.<p>I'd argue that 1 day should be enough to demonstrate your product's value, but I have no data to back this up.<p>Is there some aspect of hitting/missing a billing cycle? Why is a 45-day trial the sweet spot?
It really depends on what kind of value your service is offering.<p>For instance, Pivotal Tracker offers a 90-day free trial. Their product requires that users/teams become dependent on the service, so that they can't live without it.<p>Offering a 1-day trial here, when the product works on weekly iterations, just wouldn't work. Equally, offering a 30-day trial really only provides 4 iterations, or probably about 20-30 stories. This definitely isn't enough to "hook" the users and lock them in through the amount of time they have invested.<p>If you are in it for the long haul and your service really does provide great value, then getting your users invested in the product might come at the cost of giving away a longer free trial.<p>Remember that some services are freemium. In effect they have unlimited trial times, because the upsells and bolt-ons are where they make money.<p>Equally, however, for some services 45, 60, or 90 days would definitely be too much, and you could be losing money.<p>You can't just say in isolation that 45 days is the sweet spot for 100% of startups, because that ignores other factors such as the value the product provides.<p>Equally, if your product takes time to kick in, you wouldn't want to offer a 30-day trial. For instance, if you were a marketing platform like HomeAway or a <i>paid</i> version of AirBnB, offering a short trial might not be the best idea, because it takes time for the value of the service to really kick in.
When I launched <a href="http://bugmuncher.com" rel="nofollow">http://bugmuncher.com</a> I offered a 7-day free trial. Not too sure why; I think with it being a very MVP launch I wanted to find out as soon as possible whether people would be willing to pay.<p>Later I A-B tested 7 days against 30 days, and found 30 days did perform better, but not actually by that much. The 7-day free trial had a 9.95% conversion rate, whereas the 30-day trial had an 11% conversion rate.<p>I personally was expecting 30 days to do much better, but after seeing these results I suspect the length of the trial isn't very important to users; they just want to know that they <i>can</i> try before they buy.<p>I may try A-B testing 45 days (and maybe 60 days) if I get bored and can't think of anything else to test.
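Worth noting that a gap of 9.95% vs 11% may not be statistically meaningful at typical sample sizes. As a rough sketch, here is a standard two-proportion z-test using hypothetical sample sizes of roughly 1,000 sign-ups per variant (the actual numbers weren't given); only the rates above come from the comment:

```python
import math

def two_proportion_z_test(conversions_a, n_a, conversions_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a = conversions_a / n_a
    p_b = conversions_b / n_b
    pooled = (conversions_a + conversions_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# Hypothetical counts matching the reported 9.95% and 11% rates.
z, p = two_proportion_z_test(100, 1005, 110, 1000)
```

With samples of that size the p-value comes out well above 0.05, i.e. the 7-day and 30-day results are consistent with no real difference — which fits the intuition that trial length mattered less than expected.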
Interesting. Is there any software that makes it really easy to integrate multi-variate testing instead of just A/B?<p>How many people are really doing A/B testing on conversion rates (meaning assigning a value to a user and giving different trial lengths randomly) vs changing the trial length for fixed intervals?
I feel like there are two things that should be A/B tested here:
1) Best length of trial to get someone to sign up for a free trial
2) Best length of trial to get someone to convert to paid user.
Your actual conversion is going to be a combination of those two factors.
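To make that combination concrete: the overall rate is just the product of the two stages, and a variant that wins one stage can lose overall. A sketch with entirely hypothetical rates:

```python
# Hypothetical per-variant funnel rates:
# trial days -> (visitor -> trial sign-up rate, trial -> paid rate)
variants = {
    7:  (0.20, 0.0995),
    30: (0.18, 0.1100),
}

def overall_conversion(signup_rate: float, paid_rate: float) -> float:
    # Only the product of both stages tells you which variant wins overall.
    return signup_rate * paid_rate

best = max(variants, key=lambda days: overall_conversion(*variants[days]))
```

In this made-up example the 30-day trial converts trials to paid better, but the 7-day trial attracts enough extra sign-ups to edge it out overall (0.0199 vs 0.0198) — exactly why both factors need testing.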
A 30-day trial period with user engagement at the end of the trial, asking whether they need additional time to evaluate, will work better. Offer them a 15-day extension if they need it.<p>There are two advantages to this approach:
1. You engage the user in conversation, asking for feedback and understanding their needs.
2. You treat them as special with an offer to extend the trial, earning their trust through personal attention and improving the chances of conversion.
Wise to test different trial expiration scenarios against each other, some based on usage time, some based on product feature usage, some a combination of the two. You'll often be surprised which scenario is optimal and by how much.
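The expiration scenarios described above can be sketched as simple policies; the day limit, usage metric, and threshold below are all hypothetical placeholders:

```python
from datetime import date, timedelta

def time_based_expired(start: date, today: date, days: int = 30) -> bool:
    """Classic calendar trial: expires a fixed number of days after sign-up."""
    return today >= start + timedelta(days=days)

def usage_based_expired(projects_created: int, limit: int = 3) -> bool:
    """Usage trial: expires once the user has consumed a feature budget."""
    return projects_created >= limit

def combined_expired(start: date, today: date, projects_created: int,
                     days: int = 30, limit: int = 3) -> bool:
    # Combined scenario: trial ends when EITHER budget runs out,
    # whichever the user hits first.
    return (time_based_expired(start, today, days)
            or usage_based_expired(projects_created, limit))
```

Testing these against each other means randomly assigning each new account one policy and comparing conversion per policy, same as any other A/B test.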