I've never worked at a company where we were anywhere close to being able to do A/B testing on customers. We just always shot from the hip. I think if you are doing that kind of testing, maybe you've run out of helpful ideas?<p>Talking to real customers and helping them solve real problems is really potent. And you can get more than just the color of a button. You can get the direction your company needs to go for months.<p>I think part of the problem is that science takes too long. It's like waiting for evolution to play out. Your company is at war with everything: entropy, the economy, your competition, the attention span of customers. Do you have time to science your way to success? Probably not. Do you have time to gamble on your intuition? Barely.<p>Collecting data isn't bad per se. But you should always be asking yourself if you are solving the right problems before you waste your time on it.
As painful as it is to admit, I've come to the same conclusion, after wasting quite a lot of time and effort.<p>As Warren Buffett likes to say: "It's better to be approximately right than precisely wrong."<p>This should be a poster in every company doing any kind of data work.
Data is for seeing problems, not for finding solutions.
As in the author's example: you lose $10M to fraud? Good thing you monitor that! Otherwise it would be hard to justify spending time on it.
I worked for a large company that purports to be data-driven.<p>I watched that company converge on the blandest, clunkiest, least useful features over and over and over again.<p>Blindly trusting the data without any product vision is just design by committee at scale.
Maybe I’m missing something. The goal of an A/B test is to test a hypothesis. Where that hypothesis comes from is irrelevant. Sure you can waste your time testing stupid hypotheses that don’t have a lot of business impact, but that’s beside the point.
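To make that concrete, here is a minimal, from-scratch sketch of the statistics behind a typical A/B test (a two-proportion z-test); all the numbers below are made up for illustration and aren't from the article:

```python
# A minimal, from-scratch two-proportion z-test -- the statistics
# behind a basic A/B test. All numbers are made up.
from math import sqrt, erf

def ab_test(conv_a, n_a, conv_b, n_b):
    """One-sided test of H1: variant B converts better than A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))  # one-sided, from the normal CDF
    return z, p_value

z, p = ab_test(conv_a=132, n_a=10_000, conv_b=168, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # z ≈ 2.09, p ≈ 0.018 with these numbers
# The test only accepts or rejects the hypothesis you hand it;
# it has no opinion on whether that hypothesis was worth testing.
```

The machinery is indifferent to where the hypothesis came from, which is exactly the commenter's point.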
I don't understand. If you don't already have a thing that you're trying to do, then what cause would you have to collect and analyze data in the first place? Did these people hit their head and forget something important?
The value is in the negative space, most commonly the rejection of a hypothesis.<p>That is to say: data can certainly advise you what <i>not</i> to do. Such as flying the ship into that spooky nebula, Captain.
> Intuition is underrated<p>> Spend time where your customers are and make your own conclusions.<p>This is a great article, very well-written, and I enjoyed reading it. However, could intuition and spending time with customers be considered another way of collecting data points to inform data-driven decisions?
“Their research shows that a nonlinear approach drawing from anthropology, sociology, philosophy, and psychology, is better at getting to the moment of clarity”<p>And all of these come from data. There is no nonlinearity here, just a widening of the perceptive funnel.
I think data-driven is orthogonal to opportunity-driven and vision-driven. With the first, data is used to find opportunities; with the second, it shows you whether your strategy toward your vision is working or not.<p>Then there is politics-driven development. You want to do something and search the trove of data for data that supports what you are doing. Or you look at the data in a strategy meeting and then ignore it (I've seen this happen mostly in board meetings of large companies).
I've worked in situations where A/B testing is heavily used (and useful) and in situations where it was completely useless. Both in the same company but at separate points in the flow.<p>Where you have enough users (maybe you're a major online retailer), A/B testing should be a vital part of your toolkit. Not the only tool, but you definitely need to test every change you make. If you can gather enough data within 24 hours, why wouldn't you test your change?<p>That being said, A/B testing isn't the be-all and end-all. It just gives you some information to make a decision. You still need to know your customer, speak to them, survey, observe, etc. You might even pick a "losing" variation with the aim being to reach a more optimal business outcome. Data doesn't give you the right to abdicate your responsibility to make good decisions.<p>There are cases where A/B testing can't help at all. A great example is in low-volume but critical flows (think SaaS conversion funnels). For these, you need to rely on the other skills you have at your disposal.
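A rough illustration of why volume decides this: the sample-size math for a two-proportion test. Everything below (baseline rate, lift, the usual z-values for 5% significance and 80% power) is an assumed example, not anything from the comment above:

```python
# Back-of-the-envelope sample size per variant for a two-proportion
# test. 1.96 and 0.84 are the usual z-values for alpha = 0.05
# (two-sided) and 80% power; the rates below are hypothetical.
def sample_size_per_arm(base_rate, relative_lift, z_alpha=1.96, z_beta=0.84):
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return (z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2

# Detecting a 5% relative lift on a 2% baseline conversion rate:
n = sample_size_per_arm(base_rate=0.02, relative_lift=0.05)
print(f"~{n:,.0f} visitors per variant")  # roughly 315,000 with these numbers
# A major retailer clears that in a day; a low-volume funnel with a
# few hundred signups a week never will.
```

Under these assumptions, the same test that resolves overnight for a big retailer would take years for a low-volume flow, which is why the other skills have to take over there.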
Another point to support the same idea: data can be fallacious.<p>For instance, you are making a pretty advanced 3D web app, and notice in your analytics that your userbase is only Chrome and Safari users.<p>An easy conclusion is to focus your testing on those two platforms, or maybe even drop support entirely for Firefox and Edge by using some WebKit-specific API.<p>A not-so-easy conclusion is that the experience might be so bad or buggy on a non-WebKit browser that anyone who tries the app in those just gives up on it.<p>The reasonable truth in this case? You should assume the standard browser distribution unless you're operating in a specific market; it might also be perfectly fine to drop non-WebKit browsers if the ROI of supporting them is not worth it for your goals, etc. All of which calls not for data but for intuition and common sense.
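A toy simulation of that selection effect, with invented market shares and bounce rates, shows how the analytics end up lying:

```python
# Toy simulation of the selection effect above: users on browsers where
# the app is broken bounce before analytics counts them, so the
# observed browser split is skewed. All shares and rates are invented.
import random

random.seed(0)
TRUE_SHARE  = {"chrome/safari": 0.70, "firefox/edge": 0.30}  # assumed real market share
BOUNCE_RATE = {"chrome/safari": 0.10, "firefox/edge": 0.90}  # bounce because the app is broken

observed = {"chrome/safari": 0, "firefox/edge": 0}
for _ in range(100_000):
    browser = random.choices(list(TRUE_SHARE), weights=list(TRUE_SHARE.values()))[0]
    if random.random() > BOUNCE_RATE[browser]:  # user stays long enough to be counted
        observed[browser] += 1

total = sum(observed.values())
for browser, count in observed.items():
    print(f"{browser}: {count / total:.0%} of observed sessions")
# Prints roughly 95% / 5%, even though the real split is 70% / 30%.
```

The dashboard faithfully reports what it measured; it just can't see the users who never made it far enough to be measured.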
I love this piece. It evokes the well-known idea that "we shape our tools, and thereafter our tools shape us," drawing attention to the concept that digitally represented information serves as an observer's or machine's recorded testimony of a physical or cognitive system.<p>What is the value of that?<p>Language alone, or in this case, information, does not dictate our actions. However, there is persuasive power inherent in language — specifically, language that exposes the subjective gains individuals aim to achieve through their actions, often influencing individual behavior.<p>There exists an unexplored connection between our contemporary understanding of data and praxeology.
I'm a bit torn on this. I think basing product decisions on analytics is a bad idea, because the numbers can only tell you about features your software already has.<p>But analytics/diagnostics are extremely important to discover bugs, because you can't rely on customers to tell you about them.
Data <i>can</i> tell you what to do, but that doesn't mean your data selection and gathering were precise, right, and aligned with your actual best interests.<p>Data-informed decision-making is great. Data-driven decision-making, not so much. You still need to trust your gut.
Good insights. Before the term was bastardized, then consumed, by Machine Learning, the experimentation component was considered the killer deliverable of data scientists (putting the <i>science</i> in the term). Now, most folks assume MLE := DS, rather than MLE ⊂ DS.
I love this article; it's well written and accurate. But in my experience the bigger problem is that companies aren't using data <i>at all</i>.<p>Data-ignorant decision-making is a killer, too.
A/B testing is valuable for optimising an existing process once the low-hanging fruit has been tackled, but it should also be discontinued when the return on investment becomes negligible. Because of organisational momentum, this rarely happens, as the team would be making itself redundant. This can result in A/B testing noise.<p>Data also provides valuable insight in a negative manner. If your conversion rate is abysmal, the data tells you to get out of your cubicle and start talking with real customers to find out what the data isn't telling you. It is still a data-driven decision. It is just a negative one.<p>However, in the end, data isn't going to find your next billion-dollar opportunity. You need to find a gap in the market that no one has tackled before, and for which, of course, there is no data; otherwise someone else would have jumped on it already.
Even if you could tell with absolute certainty that changing the graphics on that packaging increases conversion (removing the disgusting graphics of cancer patients), perhaps we shouldn't be selling more cigarettes?<p>Data informs your values, but your values are a choice. From the very start, data will never define your values. Even with perfect knowledge, your decisions are still going to be a choice. Combine that with the fact that our knowledge is imperfect (our data is incomplete, biased, a single and partial perspective)...<p>"A company should seek to maximize its profits" is a normative statement, not a truth. It is a choice of values.