Hi HN! I'm Rohit, co-founder and CEO of Portkey. After building our open-source AI gateway, which now handles billions of tokens daily, we kept seeing the same problem: prompt engineering is still a dark art. Today we're launching Portkey’s Prompt Management Studio (https://portkey.ai/features/prompt-management), an IDE for prompt engineers and our answer to the prompt engineering chaos.<p>The story behind this is pretty simple. As we worked with teams building serious AI applications, we noticed a pattern: even the best prompt engineers were spending hours iterating on prompts with no systematic way to track what worked. Everyone had their own hacky setup: Notion docs, GitHub gists, local text files. There was no common workflow or tooling.<p>We started by asking a fundamental question: why is prompt engineering so hard? The answer came down to three things: you can't see what's happening inside the model, you can't easily compare across models, and you can't version and deploy prompts reliably.<p>So we built a tool to solve these problems. You can start instantly at prompt.new (https://prompt.new), just like docs.new for Google Docs, and get a powerful playground where you can test across 1,600+ models simultaneously. We added real-time performance metrics so you can see exactly how your prompt is behaving.<p>What makes this different is how we've bridged the gap between experimentation and deployment. Once you've crafted the right prompt, you can version it, label it (staging/prod), and deploy it directly to our high-performance gateway. The entire workflow happens in one place.<p>We've also tackled the prompt optimization problem with our AI Prompt Generator: tell it what you're trying to do, and it suggests optimized templates based on provider-specific best practices. This eliminates the "staring at a blank page" problem most prompt engineers face.<p>We support mustache templating and partials for building sophisticated, reusable prompt architectures.
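To give a feel for what mustache variables and partials buy you in a prompt template, here's a minimal, self-contained sketch in Python. The `render` helper and the `persona` partial are my own illustration of the mustache idea, not Portkey's implementation:

```python
import re

def render(template, variables, partials=None):
    """Minimal mustache-style rendering (illustrative only):
    expands {{> name}} partials, then substitutes {{var}} placeholders."""
    partials = partials or {}
    # Expand partials first so variables inside them get substituted too.
    template = re.sub(
        r"\{\{>\s*(\w+)\s*\}\}",
        lambda m: partials.get(m.group(1), ""),
        template,
    )
    # Substitute {{variable}} placeholders from the supplied dict.
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(variables.get(m.group(1), "")),
        template,
    )

# A reusable "persona" partial shared across several prompt templates.
partials = {"persona": "You are a concise assistant named {{name}}."}
system_prompt = "{{> persona}}\n\nAnswer the user's question about {{topic}}."

print(render(system_prompt, {"name": "Atlas", "topic": "caching"}, partials))
# → You are a concise assistant named Atlas.
#
#   Answer the user's question about caching.
```

The win is that shared fragments (a persona, output-format rules, safety boilerplate) live in one partial and get reused across many prompts instead of being copy-pasted.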
Everything stays in sync with our Python and TypeScript SDKs, so you can pull these optimized prompts directly into your application code.<p>Early users have seen their prompt development cycles speed up dramatically. We'd love your feedback: I'll be hanging out in this thread all day to answer questions and hear your thoughts.