
科技回声 (Tech Echo)

A tech news platform built with Next.js, providing global tech news and discussion.


© 2025 科技回声. All rights reserved.

Show HN: AI Proxy with support for multiple providers, caching

3 points | by ankrgyl | over 1 year ago
Hi HN,

We're excited to open source our AI Proxy, which supports Mistral, LLaMa2, OpenAI, Azure, Anthropic, and more through vanilla OpenAI SDKs. The proxy also supports configurable caching, API key management, and load balancing across multiple providers.

The proxy code also includes multiple deployment options: Cloudflare Workers, Vercel, AWS Lambda, and plain-old Express. We're open sourcing this because our customers are starting to run their production workloads through it, and we believe that critical-path-of-production tools should be open source.

To play around with a hosted version of the proxy, you can simply set the base URL in your OpenAI libs to https://braintrustproxy.com/v1 (more detailed instructions in the repo[1] and our docs[2]).

We'd love your feedback -- on use cases we may not have thought of, models we should support, or other features that would be useful to include in this layer of abstraction.

[1]: https://github.com/braintrustdata/braintrust-proxy
[2]: http://braintrustdata.com/docs/guides/proxy
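The "set the base URL" idea can be sketched with nothing but the standard library: since the proxy speaks the OpenAI wire format, a chat completion is just a POST to `/chat/completions` under the proxy's base URL. The base URL below is from the post; the model name, API key placeholder, and `build_chat_request` helper are illustrative assumptions, not code from the repo.

```python
# Minimal sketch (assumptions noted above): an OpenAI-style chat completion
# request aimed at the hosted proxy, using only the Python standard library.
import json
import urllib.request

BASE_URL = "https://braintrustproxy.com/v1"  # from the post

def build_chat_request(model, messages, api_key):
    """Build an OpenAI-compatible chat completion request for the proxy."""
    url = f"{BASE_URL}/chat/completions"
    body = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            # The provider API key is passed through; the proxy routes it
            # to the matching backend (OpenAI, Anthropic, etc.).
            "Authorization": f"Bearer {api_key}",
        },
    )

req = build_chat_request(
    "gpt-3.5-turbo",                                   # assumed model name
    [{"role": "user", "content": "Hello!"}],
    "YOUR_PROVIDER_API_KEY",                           # placeholder
)
# resp = urllib.request.urlopen(req)                   # network call, not run here
# print(json.load(resp)["choices"][0]["message"]["content"])
```

With an official OpenAI SDK, the equivalent change is a single constructor argument (e.g. `base_url=` in the Python client), which is why the authors describe it as working through "vanilla OpenAI SDKs".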

No comments yet
