For advanced analytics and true sensemaking, much of this might be true: you need a human to deeply understand data and work out what it really means.

However, let's apply the Pareto principle: 80% of the needs are served by 20% of the effort. I think for most (say 80%) of what most (roughly 80%) of companies actually need, a computer can get at least most of the way there, with far less time and effort than a human.

For example, sure, you need a human to truly understand your web analytics. But to understand it at a basic level, pull out clusters of interesting patterns, and surface anomalies and outliers you wouldn't have gone looking for otherwise, the computer can get the average intelligent person (not an analyst) most of the way there, and generally at far lower cost than a salaried analyst.

Plus, this frees analysts up for work that truly needs their expertise. As many data folks will tell you, it would be great if everything they did used their full skill set; but often they spend half their day pulling reports that a computer could have produced for the requester without any intervention, if designed correctly (not easy, mind).

So it's not about computers replacing humans entirely. It's about reducing waste, covering common cases and repeated work easily, and freeing up human minds for what they're really good at and needed for.
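To make the "anomalies and outliers you wouldn't have gone looking for" part concrete, here is a minimal sketch of the kind of automated check such a tool could run behind the scenes. The file and column names are hypothetical, and it assumes pandas is available:

    import pandas as pd

    # Hypothetical daily web-analytics export: one row per page per day.
    df = pd.read_csv("daily_pageviews.csv", parse_dates=["date"])  # columns: date, page, views

    # Flag days where a page's views sit more than 3 standard deviations from
    # that page's own mean -- a crude, fully automatic anomaly check.
    stats = df.groupby("page")["views"].agg(["mean", "std"])
    df = df.join(stats, on="page")
    anomalies = df[(df["views"] - df["mean"]).abs() > 3 * df["std"]]

    print(anomalies.sort_values("date"))  # days worth a second look, found without an analyst

It's deliberately simplistic, but this is exactly the kind of "did anything weird happen?" question a non-analyst rarely thinks to ask and a tool can answer for free.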
I work in business intelligence, and my understanding of "self-service BI" has never been "no developer needed at any point" or even "unskilled knowledge work". I have always approached it as a particular environment. Specifically, the BI team must produce the appropriate data lakes and interfaces to support user-driven design and simple aggregations. The BI team would model the data, optimize for analytical speed, apply business logic, improve data quality, etc. They would then create an interface with a "tool box" of dimensions, facts, and aggregations. At that point, the user takes over with the freedom to choose which dimensions and aggregations go into a chart, what type of visualization to use, and how to drill down or slice and dice the data. That is true "self-service BI", and I have to be honest that I haven't encountered sales pitches with the level of deceit claimed by the author.
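To illustrate that "tool box" idea, here is a rough sketch (entirely hypothetical table and field names, not any particular product) of the split: the BI team publishes pre-modelled dimensions and measures, and the user only picks which ones to combine:

    # The BI team defines the building blocks; business logic and modelling
    # live here, not with the end user. All names are made up.
    DIMENSIONS = {
        "order_month": "DATE_TRUNC('month', order_date)",
        "region": "customer_region",
    }
    MEASURES = {
        "revenue": "SUM(order_total)",
        "order_count": "COUNT(*)",
    }

    def build_query(dimension: str, measure: str, table: str = "sales_mart") -> str:
        """Assemble a query from the pre-modelled pieces; the user never
        writes SQL or touches the underlying modelling."""
        return (
            f"SELECT {DIMENSIONS[dimension]} AS {dimension}, "
            f"{MEASURES[measure]} AS {measure}\n"
            f"FROM {table}\nGROUP BY 1\nORDER BY 1"
        )

    # The "self-service" part: the user chooses the combination, nothing more.
    print(build_query("order_month", "revenue"))

That's the whole contract: the freedom is in which blocks go into the chart, not in how the blocks are built.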
The opinion piece doesn't give me much concrete to go on, but I'm not sure I agree. Of course terms like "self-service" have many meanings, but I've seen it used successfully by people with a range of data skills. Some examples:

1. Static reports that were previously delivered weekly are replaced with daily-updated interactive tools in Tableau (or similar), which allow business users to drill down and filter.

2. Reports that previously required custom SQL are replaced with reports built by a business user (or a Tableau power user), with much quicker turnaround and better results.

And of course Excel, the most widely used self-service BI tool, is used successfully every day by almost every business in the world. Its use has its problems, but overall it's extremely valuable... and it's definitely self-service and definitely BI.
In my experience, "expertise in making sense of data" is only one piece of the puzzle, and often not even the most important one.

Domain expertise is hugely important in making sense of data. Self-service lets domain experts look at the data themselves, quickly. They may have to learn some data-sensemaking skills, but the data expert would have to learn the specific domain (often much harder).

I'm noticing that more and more people in a variety of fields have at least a passable understanding of how to make sense of data. For quick questions, self-service access to data makes the process much faster with little risk.

I've been in organizations that tried to put data behind gatekeepers who would protect users from making mistakes. In those cases, we made a lot more mistakes, because not enough analysis was done or people simply didn't have access to data.

I've been in other organizations where we let everyone look at the data. Sure, some people made mistakes, but we used that as an opportunity to teach.

If I had to bet on which type of firm would win, I'd bet on the latter. I'm deeply skeptical of the promises made by BI vendors, but self-service analytics isn't one of the ones I'd dismiss.
When companies want self-service BI, what they really want is zero-code tools that a BI developer/analyst can use to build on-demand standard reports for business users, with predefined click-through paths.

And yes, software makers exaggerate in their advertising.

Compared to the author's other posts, this one seems hurried and maybe a little grumpy. But it still makes an important point.
The most used BI tool out there, Excel, is self-service, and it has been for at least 30 years now. SSBI is not about taking people out of the loop; it is about empowering them with the right tools. Tableau, for instance, is nothing more than an Excel 2.0, and nobody I know, not even Tableau salespeople, says we will get all our analytics automagically; very far from that. Every department in corporations is struggling to find people who are good with data, every department. Also, data analytics comes at the very end; again, nobody is saying your data strategy (blending, complex calcs, etc.) will be auto-generated, that would be crazy talk. You should think of SSBI more like what happened with mainframes and PCs, where the key to success was empowering more and more humans.
If A is the set of people who can write SQL and B is the set of people who know how to make inferences from data, self-service BI tools can expand the universe of people who can analyze data at your organization from A∩B to B. That's a big deal!

Yes, there are pitfalls if you give people in ~B access to the BI tool. They could make bad decisions based on the data. On the other hand, they are already making bad decisions without the data. In a world where |B| is limited, it can be hard to make the right tradeoffs. You might choose to let people in ~B use the tool but have a policy against sharing the results without a once-over from someone in B.
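With made-up head-counts, purely to illustrate the size of that jump:

    # Hypothetical staff lists, only to show A ∩ B versus B.
    can_write_sql = {"ana", "bo", "cy"}                                # A
    can_reason_about_data = {"ana", "bo", "dee", "eli", "fay", "gus"}  # B

    without_tool = can_write_sql & can_reason_about_data  # A ∩ B: people who can do both
    with_tool = can_reason_about_data                      # B: everyone who can reason about data

    print(len(without_tool), "->", len(with_tool))  # 2 -> 6: the pool of analysts triples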
I work for a company making a "self-service analytics" tool, but we don't sell it as a "magical analytics" tool that does everything for you. Rather, we see it as a "you don't have to involve the IT department" tool for those quick jobs that analysts or other knowledgeable people would ideally like to do themselves, but can't due to lack of tools.

To me, what is wrongly called self-service analytics is often really self-service reporting. You are consuming the end result, not the analysis itself.
Many good points. Nevertheless, self-service analytics is the only way to give business users what they need in a reasonable time frame. We've already seen what happens when every little change request has to be implemented by IT: time to production becomes weeks and months instead of hours. Perhaps there should be some kind of framework with reasonable restrictions and simplifications, but that's a very open question. I've yet to see a good example of reasonably restricted self-service.
Well, most self-service analysts I've met mixed statistical correlation with causation quite blindly. One needs plenty of common sense before making judgements about the data. Otherwise the dynamics of Danube floods will be taken as the cause of changes in the number of students in Budapest, or vice versa.
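A toy sketch of how easily that happens: two completely unrelated series that both trend upward over time will show a strong correlation (synthetic data, assuming numpy is available):

    import numpy as np

    rng = np.random.default_rng(0)
    years = np.arange(1990, 2020)

    # Two unrelated quantities that both happen to drift upward over the years.
    river_level = 50 + 0.8 * (years - 1990) + rng.normal(0, 2, len(years))
    student_count = 10_000 + 300 * (years - 1990) + rng.normal(0, 800, len(years))

    r = np.corrcoef(river_level, student_count)[0, 1]
    print(f"correlation: {r:.2f}")  # typically above 0.9, yet neither causes the other

The shared trend (time) does all the work; the tool will happily report the correlation either way.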
I would agree that most business managers who sponsor the purchase of such tools grossly overestimate their utility and underestimate the cost of getting a self-service solution in place.
Also served over HTTPS, thanks to the RapidSSL cert: https://www.perceptualedge.com/blog/?p=2467
As much as I like Few's books and have relied on them heavily in my time building SSBI tools, he has a tendency to take something I basically agree with and blow it way out of proportion into an all-or-nothing declaration. He seems true to form again today: he takes a basically correct point, that many SSBI vendors over-promise and mislead with their marketing, and turns it into "self-service BI is a lie", which it absolutely is not.

Self-service BI tools are not intended to take the place of analytic skill any more than they take the place of domain expertise, and that has never been the meaning of the term. The promise of SSBI is to reduce the incredible friction domain experts traditionally had to deal with to get their key business questions answered. Yes, your users need to develop other analytic skills to go along with their domain expertise! It turns out most of us have stronger and weaker points and have to learn and evolve our skills to do our jobs well. Taking on SSBI means exactly that for your users, who most likely already have at least one of the key skills (domain expertise), and maybe more.

Using an exploratory SSBI tool is a conversation with your data via an interactive tool. One question leads to another leads to another, and if the alternative is having to stop and ask another department to put each follow-up question on their backlog, the conversation is basically broken, and often business users just stop asking and revert to pure gut-feel decision making. I think most of the progress made over the past 20 years in BI has been about making this kind of process more agile, in the same sense as iterative development. SSBI is part of that. The inversion of the analytic process in big data systems is part of that as well. ML can also play a role under the right circumstances.

What we can do, as BI vendors, is build tools and documentation that guide users who start with only part of the skills they need into learning the rest while using the tools we provide. We can present defaults that steer the user toward visualizations and views that are easier to interpret. We can embed analytic skill-building into our applications as tutorials and hints. We can build metadata up as users tell us about the data while they use it, rather than requiring them to do it up front in a big-bang DW modeling session. We can inspect the data with simple heuristics to hint at how to use it with the tool, or to apply better defaults. We can build better cleansing, munging, and data-consolidation tools. We can build tools that let data analyst teams turn things around more quickly. And yes, perhaps we can try using machine learning to suggest possible avenues for the user to explore, which _of course_ must be interpreted by a user with the right skills, because ML is going to be wrong a lot of the time and users need to understand false positives. It's all part of the process.

Bailing out on SSBI because marketers are being marketers just isn't pragmatic. Better that we keep evolving our products from both the data science team end and the business-user SSBI end.
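For what "inspect the data with simple heuristics" can look like, here is a rough sketch, not any particular product's logic, of guessing sensible default roles for columns so the tool can offer a reasonable first chart (assumes pandas; the example table is made up):

    import pandas as pd
    from pandas.api import types

    def suggest_roles(df: pd.DataFrame, max_dim_cardinality: int = 50) -> dict:
        """Guess whether each column is best treated as a date axis, a measure,
        or a dimension, so the tool can pre-select a sensible default view."""
        roles = {}
        for col in df.columns:
            series = df[col]
            if types.is_datetime64_any_dtype(series):
                roles[col] = "date axis"
            elif types.is_float_dtype(series):
                roles[col] = "measure"
            elif series.nunique() <= max_dim_cardinality:
                roles[col] = "dimension"
            else:
                roles[col] = "ignore by default (high-cardinality column)"
        return roles

    # Hypothetical sales extract: the tool might default to a measure broken
    # down by a low-cardinality dimension, with the date column as a filter.
    df = pd.DataFrame({
        "order_date": pd.to_datetime(["2024-01-05", "2024-01-06", "2024-01-07"]),
        "region": ["East", "West", "East"],
        "order_total": [120.5, 99.0, 431.2],
    })
    print(suggest_roles(df))

The heuristic will be wrong sometimes, which is fine; the point is to give a user with domain expertise a sane starting place instead of a blank canvas.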