Hey HN, I’m Brad, one of the founders of Superblocks, a programmable cloud IDE for internal tools. This week we launched a deep integration with OpenAI, giving developers an app layer for GPT-4 powered internal tools: components, integrations, permissions, audit logging, SSO, observability, and more.

Imagine you want to build an AI chat copilot for your support team that answers customer questions from your company’s corpus of information. With Superblocks, you can query a vector database like Pinecone or Weaviate directly, feed the results into OpenAI through our integration with a custom prompt, hook the response from OpenAI up to a Chat component in the UI, and click deploy with git. (There’s a rough code sketch of this flow below the feature list.)

Fun fact: we built this flow ourselves and demoed it at an OpenAI hackathon a few weeks ago!

Our AI app layer gives developers access to every OpenAI API:

- An intuitive UI on top of the API, so you don’t have to decipher API references or hand-tune hyperparameters
- Prompt engineering fields that let you pull in any data via code variables, right inside the Prompt and System Instruction fields
- Cost optimization via static or dynamic token limits, keeping API usage in check and avoiding runaway costs
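If you’re curious what the copilot flow above looks like as plain code, here’s a minimal Python sketch of the backend step you’d wire up. This is not our exact implementation: the index name "support-docs", the "text" metadata field, the prompt wording, and the 512-token limit are illustrative assumptions, and it uses the 2023-era openai and pinecone-client SDKs.

    # Sketch: answer a support question from your own docs with Pinecone + GPT-4.
    import openai
    import pinecone

    openai.api_key = "OPENAI_API_KEY"
    # api_key, environment, and index name are placeholders for illustration.
    pinecone.init(api_key="PINECONE_API_KEY", environment="us-east-1-aws")
    index = pinecone.Index("support-docs")

    def answer(question: str) -> str:
        # 1. Embed the question and pull the most relevant docs from the vector DB.
        emb = openai.Embedding.create(
            model="text-embedding-ada-002", input=question
        )["data"][0]["embedding"]
        matches = index.query(vector=emb, top_k=5, include_metadata=True).matches
        context = "\n\n".join(m.metadata["text"] for m in matches)

        # 2. Feed the retrieved context into GPT-4 with a custom prompt.
        #    max_tokens is the static token limit that keeps costs bounded.
        resp = openai.ChatCompletion.create(
            model="gpt-4",
            max_tokens=512,
            messages=[
                {"role": "system",
                 "content": "Answer support questions using only the provided context."},
                {"role": "user",
                 "content": f"Context:\n{context}\n\nQuestion: {question}"},
            ],
        )
        # 3. This string is what you'd bind to the Chat component in the UI.
        return resp["choices"][0]["message"]["content"]

In Superblocks, the same steps map onto the vector DB integration, the OpenAI integration’s Prompt and System Instruction fields, and a binding to the Chat component.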
5-min video: https://cdn.superblocks.com/videos/superblocks-build-ai-powered-tools-with-gpt4-04252023.mp4

Would love to hear feedback from the HN community!

PS. We hear that lots of developers want to use private LLMs for sensitive data so they don’t have to send it to OpenAI’s servers; that’s something we’re working towards. We also plan to add support for more LLMs.