Anecdotally, I worked for an advisory and reporting company that produced reports and ranked companies in a hazily defined index (the firm was later bought by another, larger advisory company). The way the sausage was made was not pretty. The research staff consisted largely of recent Ivy League graduates with little experience in the domain or in statistical methods. Data was largely collected from the web; initially this was done by hand, though we were working to automate as much as possible (leadership did not quite appreciate the need to invest in tech and did not understand the dao of building software services). In some cases, the data collected fell woefully below the minimum sample size you'd need to justify drawing any meaningful conclusions (there wasn't even a sense of what a minimum sample would look like, and little concern when these issues were raised with management). I recall researchers telling me that metrics and rankings were adjusted for important clients, something that smells to me like pay-to-play. The data science team was wholly unqualified. The management, across the board, was unqualified. The article suggests insider trading practices inside consulting companies. I had a similar suspicion, though not something I can prove: that insider knowledge was being laundered through cooked data and presented as "rigorous, data-driven insight".<p>The impression I got of this space:<p>1. Clients, usually executives within large, billion-dollar companies, don't really mind spending what for them are pennies on a subscription from a company that claims to know what's up when they feel they don't. They see a company with certain associated credentials and trust the advice.<p>2. Executives, wishing to justify their own decisions, want to cover their asses by appealing to "experts". Their employers may come across those reports, see their company ranking highly, and say "Wow, Bob. 
You've done a great job." It matters little how meaningful the data analysis is; the analysis is just decorative.<p>3. The rankings in these reports become a self-fulfilling prophecy of sorts. If you rank highly in an advisory report on a certain kind of product, and readers of that report believe the ranking, then that can lead to more sales or whatever.<p>To be fair, this isn't some kind of Newtonian mechanics: you can't take a bunch of data and have all your conclusions fall out as tidy positions in space and time. I am not claiming that advice cannot be given, or that these things cannot be researched to produce meaningful insight. Nor do I claim that there are no companies out there producing valuable research. However, my personal experience does make me wonder. Even if and when the claims in the research are true, the appeals to data can become little more than a game of appearances. Maybe, in relative terms, they're not in much worse shape than academia, with its data massaging, p-hacking scandals, and unreplicated or unreplicable studies.