
科技回声 (Tech Echo)

A tech news platform built with Next.js, providing global technology news and discussion.


© 2025 科技回声 (Tech Echo). All rights reserved.

Show HN: Provide LLMs markdown documentation of libraries and frameworks

6 points | by vivekkalyan | about 2 months ago
Hey HN!

We realised that LLMs are great at generating code for super popular libraries like React. But they kinda suck at using less popular or newly released libraries, forcing us to stick to established tools and hindering innovation.

There is already a standard for creating documentation for LLMs (llmstxt.org), but in my experience the implementations have not been great so far. `llms.txt` works as a good index of the available pages, but in many cases they link to HTML pages. This is a waste for LLMs to parse through (for example, Hono's [best practices](https://hono.dev/docs/guides/best-practices) page is ~37k tokens, while its core content as Markdown is only ~1k tokens).

To help with this, we built Atlas Docs, a service + MCP server that provides LLMs with clean, LLM-friendly documentation for various libraries.

On the backend, it scrapes pages from documentation sites, processes them into standardized, clean Markdown, and stores them in a database. It uses an existing `llms.txt` where available, standardizes links, and crucially, *generates a clean `llms.txt` index* if the library doesn't provide one.

The MCP server exposes the processed documentation via the [Model Context Protocol](https://modelcontextprotocol.io/). This lets LLMs list, search, and query the docs and retrieve the relevant pages in clean Markdown format.

This gives LLMs the structured, concise context they need to generate better code, and in our limited testing so far it definitely improves the success rate of working with less popular libraries.

If you're using LLMs for coding, give it a try! You can find instructions to install it with any client that supports MCPs (Cursor, Windsurf, Cline, Claude Desktop, etc.).

We're actively adding more libraries. Let me know what you'd like to see supported!
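To get an intuition for the HTML-vs-Markdown token gap the post describes, here is a minimal sketch (not Atlas Docs' actual pipeline) that strips a page down to its visible text with Python's stdlib `html.parser` and compares rough token counts. The sample page and the ~4-characters-per-token heuristic are illustrative assumptions.

```python
# Illustrative sketch: why raw HTML wastes tokens compared to its core text.
# The page below and the chars/4 token heuristic are stand-ins, not the
# actual Atlas Docs pipeline or a real tokenizer.
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collects visible text, dropping tags plus script/style contents."""

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0  # depth inside <script>/<style> elements

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)


def approx_tokens(text: str) -> int:
    # Crude heuristic: roughly 4 characters per token for English text.
    return max(1, len(text) // 4)


# A toy documentation page with the usual non-content overhead.
html_page = """
<html><head><style>body{margin:0;font:16px sans-serif}</style></head>
<body><nav>Home | Docs | Blog | GitHub</nav>
<h1>Best Practices</h1>
<p>Write small, composable handlers.</p>
<script>trackPageView();</script></body></html>
"""

extractor = TextExtractor()
extractor.feed(html_page)
core_text = " ".join(p.strip() for p in extractor.parts if p.strip())

print("raw HTML tokens ~", approx_tokens(html_page))
print("core text tokens ~", approx_tokens(core_text))
```

On a real docs page the gap is far larger than in this toy example, since navigation chrome, inline CSS, and scripts dominate the byte count.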

No comments yet