Hi HN! I've spent the last year building The Data Workspace, a tool that makes data analysis accessible to non-technical business users while dramatically reducing LLM token consumption.<p>Most LLM-based data tools simply hand entire datasets to the model, which drives token usage through the roof. By combining zero data exposure (the model never sees raw rows) with local query validation, I've reduced token consumption by about 80% compared to standard approaches.<p>I'd love your feedback, especially from anyone who's wrestled with the cost and complexity of data analysis. I'd also appreciate thoughts on our approach to token optimization!<p>Try it out: <a href="https://www.thedataworkspace.com/" rel="nofollow">https://www.thedataworkspace.com/</a>
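<p>To give a flavor of the general pattern (this is a minimal sketch of the idea, not The Data Workspace's actual implementation; all names and prompt wording here are illustrative): instead of pasting rows into the prompt, you send only the table schemas, and you dry-run the model's generated SQL locally so malformed queries never cost an extra LLM round-trip.

```python
import sqlite3

def schema_only_prompt(conn):
    """Build an LLM prompt from table DDL alone -- no row data leaves the machine."""
    rows = conn.execute(
        "SELECT sql FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    schema = "\n".join(r[0] for r in rows)
    return f"Given this schema, write a SQL query answering the user's question:\n{schema}"

def validate_locally(conn, query):
    """Dry-run generated SQL with EXPLAIN so bad queries are caught before execution."""
    try:
        conn.execute(f"EXPLAIN {query}")
        return True
    except sqlite3.Error:
        return False

# Demo with an in-memory database (illustrative data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.execute("INSERT INTO sales VALUES ('EU', 100.0), ('US', 250.0)")

prompt = schema_only_prompt(conn)  # contains only DDL, never the rows
ok = validate_locally(conn, "SELECT region, SUM(amount) FROM sales GROUP BY region")
bad = validate_locally(conn, "SELECT nope FROM missing_table")
```

The prompt stays a few dozen tokens regardless of table size, which is where most of the savings come from in a schema-only approach.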