Portkey's Prompt Engineering Studio – The IDE for Prompt Engineers

2 points by roh26it | 3 months ago
Hi HN! I'm Rohit, co-founder and CEO of Portkey. After building our open-source AI gateway that now handles billions of tokens daily, we kept seeing the same problem: prompt engineering is still a dark art. Today we're launching the IDE for Prompt Engineers: Portkey's Prompt Management Studio (https://portkey.ai/features/prompt-management), our answer to the prompt engineering chaos.

The story behind this is pretty simple. As we worked with teams building serious AI applications, we noticed a pattern: even the best prompt engineers were spending hours on prompt iteration with no systematic way to track what worked. Everyone had their own hacky setup: Notion docs, GitHub gists, local text files. There was no common workflow or tooling.

We started by asking a fundamental question: why is prompt engineering so hard? The answer came down to three things: you can't see what's happening inside the model, you can't easily compare across models, and you can't version and deploy things reliably.

So we built a tool to solve these problems. You can instantly start at prompt.new (https://prompt.new) – just like docs.new for Google Docs – and get a powerful playground where you can test across 1600+ models simultaneously. We added real-time performance metrics so you can see exactly what's happening with your prompt.

What makes this different is how we've bridged the gap between engineering and deployment. Once you've crafted the perfect prompt, you can version it, label it (staging/prod), and deploy it directly to our high-performance gateway. The entire workflow happens in one place.

We've also solved the prompt optimization problem with our AI Prompt Generator. Tell it what you're trying to do, and it'll suggest optimized templates based on provider-specific best practices. This eliminates the "staring at a blank page" problem most prompt engineers face.

We support mustache templating and partials for creating sophisticated, reusable prompt architectures. Everything works in sync with our Python and TypeScript SDKs, so you can pull these optimized prompts directly into your application code.

We've been working with early users who've seen their prompt development cycles speed up dramatically. We'd love to get your feedback – I'll be hanging out in this thread all day to answer questions and hear your thoughts.
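To give a rough idea of how the mustache templating and the SDK fit together, here is a minimal Python sketch. The prompt ID, variable names, and template text are placeholders, and the call shape assumes the portkey-ai SDK's prompt completions endpoint:

    from portkey_ai import Portkey

    # Placeholder API key and prompt ID, for illustration only.
    portkey = Portkey(api_key="PORTKEY_API_KEY")

    # Suppose the versioned template in the Studio reads:
    #   "Summarize the following {{doc_type}} for a {{audience}} reader: {{content}}"
    # The variables below fill those mustache placeholders at call time.
    completion = portkey.prompts.completions.create(
        prompt_id="pp-summarizer-prod",  # hypothetical prompt ID with a prod label
        variables={
            "doc_type": "changelog",
            "audience": "non-technical",
            "content": "v2.3 adds prompt partials and deployment labels.",
        },
    )

    # Response follows the familiar chat-completions shape.
    print(completion.choices[0].message.content)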

no comments