I found that observability platforms like Datadog and New Relic can be really expensive and not that valuable unless you place a centralized, data-agnostic "real-time observability pipeline" between the originating telemetry sources and the end destination (Datadog, New Relic, Looker, etc.).<p>This lets you see the data stream in real time, then reduce, enrich, and transform the data before routing it to any destination.<p>We did this and saw some pretty significant savings from the observability vendors we use. On top of that, being able to enrich the data made our troubleshooting much faster, because the ingested data was now something we could actually make sense of and act on. We used datable.io, a startup out of SF, to accomplish this. When we talked with them about a month ago their service was still in beta, but they were letting us use it free of charge (not sure if they're still doing that?). I think the founder is an ex-New Relic guy?
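<p>To make the reduce/enrich/transform/route idea concrete, here's a rough sketch of what such a pipeline stage does conceptually (all names and fields are made up for illustration; this is not datable.io's actual API):

```python
def reduce(events):
    # Drop low-value telemetry (e.g. debug logs) before it hits the vendor's bill.
    return [e for e in events if e.get("level") != "debug"]

def enrich(events, metadata):
    # Attach context (env, team, owner) so events are actionable downstream.
    return [{**e, **metadata} for e in events]

def transform(events):
    # Normalize field names across sources (hypothetical rename: msg -> message).
    return [{("message" if k == "msg" else k): v for k, v in e.items()}
            for e in events]

def route(events, destinations):
    # Fan the cleaned stream out to each configured destination
    # (Datadog, New Relic, a data lake, ...).
    for send in destinations:
        send(events)

raw = [
    {"level": "debug", "msg": "heartbeat"},
    {"level": "error", "msg": "db timeout"},
]
out = []  # stand-in for a real destination sink
route(transform(enrich(reduce(raw), {"env": "prod", "team": "payments"})),
      [out.extend])
print(out)  # only the enriched, normalized error event survives
```

The point is that every filter or rename happens before the vendor meters the data, which is where the savings come from.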