Great idea!<p>Observability (AKA a debug/proxy/statistics/logging/visualization layer) -- for LLMs (AKA chat AIs)...<p>Hmmm, you know, I would love something for ChatGPT (and other AI chatbots) -- where you could open a second tab or window -- and see (and potentially interact with) debug info and statistics from prompts given to that AI in its main input window, in realtime...<p>Sort of like what STDERR is for programs running on Unix -- but an "AI STDERR", AKA a debug channel, for AIs...<p>I'm guessing (but not knowing) that in the future, there will be standards defined for debug interfaces to AIs, for the data formats and protocols traversing those interfaces, and for such things as error, warning, hint, and informational messages...<p>Oh sure, a given AI company could pick its own set of interfaces, data protocols, and rules for interpreting that data.<p>But if so, that "AI debug interface" wouldn't be universal.<p>Of course, on the flip side, if a universal "AI debug interface" were ever established, perhaps it would eventually suffer from the complexity, over-engineering, and bloat that plague many designed-by-committee standards in today's world.<p>So, it will be interesting to see what the future holds...<p>To take an Elon Musk quote and twist it around (basically abuse it! <g>):<p>"Proper engineering of future designed-by-committee standards with respect to AI interfaces and protocols is NOT guaranteed -- but excitement is!"<p>:-) <g> :-)<p>Anyway, with respect to the main subject/article/authors: what you're doing is a very interesting and forward-thinking idea, you're breaking new ground, and I wish you all future success with your company, business, product and product ideas!
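<p>For what it's worth, the "AI STDERR" idea above can be sketched in a few lines: the answer itself goes to STDOUT, while structured debug events (as JSON lines) go to STDERR, where a second window could watch them with something like `tail -f`. This is purely a hypothetical illustration -- the function name, event fields, and canned tokens are all made up, not any real AI's API:

```python
import json
import sys
import time

def answer_with_debug(prompt: str) -> str:
    """Stream a (canned) answer to stdout while emitting JSON debug
    events -- counts, timings, etc. -- on stderr, the 'AI STDERR'."""
    def debug(event: dict) -> None:
        # A second window could watch this stream in realtime.
        print(json.dumps(event), file=sys.stderr)

    debug({"level": "info", "msg": "prompt received", "chars": len(prompt)})
    start = time.monotonic()
    tokens = ["Hello", " from", " the", " model."]  # stand-in for real model output
    for i, tok in enumerate(tokens):
        sys.stdout.write(tok)
        debug({"level": "debug", "token_index": i, "token": tok})
    debug({"level": "info", "msg": "done",
           "tokens": len(tokens),
           "elapsed_s": round(time.monotonic() - start, 3)})
    return "".join(tokens)
```

Because the two streams are separate, the debug channel can be redirected, logged, or ignored independently of the answer -- exactly the property that makes STDERR useful on Unix.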