It started this day

In early 2023, Sacha, then CTO of Le Monde, France's leading news outlet with 300 million page views and 46 million users per month, faced a critical issue. Despite his pursuit of perfection, he noticed discrepancies in core metrics across corporate dashboards, weekly activity reports, and monthly management reports.

Key indicators such as Monthly Active Users (MAU), Daily Active Users (DAU), sessions, page views, and retention rates were consistently inaccurate, often understated by a significant double-digit percentage. The analytics tools, costing Le Monde hundreds of thousands of dollars annually, were failing to provide precise data.

Gilles, CEO of Letsmeet, a productivity software company, encountered a similar problem. An A/B test showed a 17% improvement with variant B after two months, but once it was deployed to all users, the improvement dropped to 2%. Investigation revealed that 30% of Letsmeet users, including 70% of core users, were not tracked by the analytics tools at all.

Separately, Sacha and Gilles discussed their concerns with industry peers across the media, ecommerce, entertainment, and productivity sectors in both France and the US. They discovered a widespread issue: by a conservative estimate, a quarter of users, representing a third of usage, were missing from every analytics tool.

These digital companies relied heavily on data for strategic decisions. Detailed dashboards on retention, customer lifetime, and cohort behavior were standard. A/B testing and conversion funnel analysis drove product decisions. Some even used AI to model user behavior. Despite the inaccuracies, management had no choice but to proceed as if the data were correct, lacking any viable alternative.

In the next post, we will answer this fundamental question: why is analytics data so consistently inaccurate?