
Show HN: A lightweight LLM proxy to get structured results from most LLMs

13 points | by lunarcave | 3 months ago
Hey HN!<p>After struggling with complex prompt engineering and unreliable parsing, we built L1M, a simple API that lets you extract structured data from unstructured text and images.<p><pre><code> curl -X POST https:&#x2F;&#x2F;api.l1m.io&#x2F;structured \ -H &quot;Content-Type: application&#x2F;json&quot; \ -H &quot;X-Provider-Url: demo&quot; \ -H &quot;X-Provider-Key: demo&quot; \ -H &quot;X-Provider-Model: demo&quot; \ -d &#x27;{ &quot;input&quot;: &quot;A particularly severe crisis in 1907 led Congress to enact the Federal Reserve Act in 1913&quot;, &quot;schema&quot;: { &quot;type&quot;: &quot;object&quot;, &quot;properties&quot;: { &quot;items&quot;: { &quot;type&quot;: &quot;array&quot;, &quot;items&quot;: { &quot;type&quot;: &quot;object&quot;, &quot;properties&quot;: { &quot;name&quot;: { &quot;type&quot;: &quot;string&quot; }, &quot;price&quot;: { &quot;type&quot;: &quot;number&quot; } } } } } } }&#x27; </code></pre> This is actually a component we unbundled from our larger because we think it&#x27;s useful on its own.<p>It&#x27;s fully open source (MIT license) and you can:<p>- Use with text or images - Bring your own model (OpenAI, Anthropic, or any compatible API) - Run locally with Ollama for privacy - Cache responses with customizable TTL<p>The code is at <a href="https:&#x2F;&#x2F;github.com&#x2F;inferablehq&#x2F;l1m">https:&#x2F;&#x2F;github.com&#x2F;inferablehq&#x2F;l1m</a> with SDKs for Node.js, Python, and Go.<p>Would love to hear if this solves a pain point for you!

1 comment

alboaie, 3 months ago
Looks useful. Could you explain how it works? Do you have to chain it after the call from another LLM?