This sort of segmentation exists in many, many free trials. Sometimes the causal arrow can be difficult to tease out. For example, one's best customers often engage more with the app on every conceivable dimension, so if you happen to test a particular feature, you'll see that adoption of that feature correlates with conversions and might be inclined to, e.g., try to juice adoption by promoting it in onboarding. That works only some of the time.<p>In general, though, a strong endorsement of figuring out what people care about, quantitatively and qualitatively, instrumenting it, and then testing improvements to it early in the funnel. The gold standard is, of course, not retrospective analysis but split testing the intervention. I sympathize that this is difficult for a lot of SaaS apps, as you need high trial volumes to make it work.<p>I can't share client results, but my blog probably has two or three write-ups of 10%+ lifts for BCC doing this.
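The "gold standard" split test mentioned above can be checked with a standard two-proportion z-test. A minimal sketch, using only the standard library; all counts here are invented for illustration, not from Ghost's data:

```python
# Two-proportion z-test for a split test on an early-funnel
# intervention (e.g. promoting a feature during onboarding).
# All numbers below are hypothetical.
from math import sqrt, erfc

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) for two variants'
    conversion counts, using a pooled standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided normal tail
    return z, p_value

# Hypothetical trial volumes: control vs. onboarding promotion.
z, p = two_proportion_z(conv_a=120, n_a=2000, conv_b=156, n_b=2000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Note how even a healthy 6% → 7.8% lift needs thousands of trials per arm to clear significance, which is exactly the high-volume problem for smaller SaaS apps.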
The title is wrong. Ghost didn't increase their conversion rate by 1000%. They found a segment of their user base that converted 1000% more than another segment.<p>The post doesn't say whether the overall conversion rate actually improved substantially after the changes were made.
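A toy calculation with invented numbers makes the distinction concrete: a segment can convert 1000% better than another while the overall conversion rate stays modest, because the high-converting segment may be small.

```python
# Hypothetical trial cohort (numbers invented, not Ghost's data):
# a small segment converts at 11%, the rest at 1%.
users = {
    "added_theme": {"n": 600,  "conversions": 66},   # 11% conversion
    "no_theme":    {"n": 9400, "conversions": 94},   # 1% conversion
}

rate = {k: v["conversions"] / v["n"] for k, v in users.items()}
lift = rate["added_theme"] / rate["no_theme"] - 1  # relative lift

overall = (sum(v["conversions"] for v in users.values())
           / sum(v["n"] for v in users.values()))
print(f"segment lift: {lift:.0%}, overall conversion: {overall:.2%}")
```

The segment-vs-segment lift is 1000%, yet the blended conversion rate is only 1.6%, which is why the headline number says little about the overall rate.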
How do you know that you have the cause and effect the right way around? After all, it could well be that users add a theme <i>because</i> they are about to buy the service, and figure they should spend some time customising their pages now that they've decided to sink money into it.<p>OK, so that may not sound totally convincing, but in general I think this is a big problem with the analytics in the article. If you can't tell whether A caused B or B caused A, you can't reliably act on it.
There are "ah ha!" moments, and there are also "oh no!" moments.<p>I love Ghost's simplicity for blogging, but then I spotted an egregious spelling mistake in one of the articles I'd published. I went back to the editor to check where I'd missed the ubiquitous red squiggly line, and found there wasn't one. No squiggly line, and in fact no spellchecker at all!<p>Given that spelling is very important for blogs, and that all browsers have very effective spellcheckers built in, I'm at a loss to understand why this wasn't bug number one for the developers. They use a markdown editor that somehow kills browser spellchecking, even though there are several alternatives out there. This was a definite "Oh no!" moment for me.
This data intuitively makes sense -- the added theme gives users a sense of "this is <i>my</i> product," where before it was "this is a Ghost product that I'm using right now."<p>As a user, however, I'm annoyed by videos, and hardly ever watch tutorial videos. Why spend a minute and a half watching something when I can read it much faster? Especially if setting up a theme is as easy as you're trying to make it -- shouldn't it be obvious how to do it?
I find it a little suspect that you had exactly 3400 people watch your training video and exactly 6600 not watch the video. Those seem like very convenient statistics. Am I missing something?
This goes to show how important analytics is in a business. They found something they could market better in their own software by understanding their business more. Big ups to Ghost for that discovery.
[Disclaimer: Off topic and reaching out for help]<p>My girlfriend and I are looking for some gigs so we can buy plane tickets and meet in February (she is in Italy right now while I am in China).<p>She is working through her PhD in Statistics while I am a developer with a lot of interest in data analysis and visualization.<p>If somebody needs the same kind of analysis they did at Ghost, it is something we can handle pretty well.<p>If you want to collaborate, just drop a line at:
simone (at) mweb (dot) biz<p>[End off topic, and sorry about that]