Show HN: Cai – The fastest CLI tool for prompting LLMs

1 point by adius about 1 year ago
While using LLMs I realized a few things:

- I often still prefer Google, because I feel like I can get an answer quicker.
- I'd rather ask a smaller LLM a few questions than GPT-4 just one.
- The latency of LLMs is often enough to lose your momentum or abort the generation.

So I asked myself: how could I build the fastest LLM prompt for the CLI? My best guess is to use the fastest language (Rust) and the fastest LLM (Mixtral powered by https://groq.com).

And it's a game changer for me! At this speed it can replace most Googling, reading man pages, looking stuff up, … I can't wait to extend it with more features! =)

Do you have any ideas how to get it even faster?
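For anyone curious what the round trip boils down to, here is a minimal sketch in Rust, assuming Groq's OpenAI-compatible chat completions endpoint, the `mixtral-8x7b-32768` model id, and the `reqwest` (blocking + json) and `serde_json` crates. It shows one way to fire a single low-latency prompt from the CLI, not necessarily how Cai itself is implemented:

    // Cargo.toml (assumed versions):
    //   reqwest    = { version = "0.11", features = ["blocking", "json"] }
    //   serde_json = "1"
    use std::env;

    fn main() -> Result<(), Box<dyn std::error::Error>> {
        // Everything after the binary name is the prompt.
        let prompt = env::args().skip(1).collect::<Vec<_>>().join(" ");
        let api_key = env::var("GROQ_API_KEY")?;

        // Assumed model id; swap in whatever Groq currently serves.
        let body = serde_json::json!({
            "model": "mixtral-8x7b-32768",
            "messages": [{ "role": "user", "content": prompt }]
        });

        let resp: serde_json::Value = reqwest::blocking::Client::new()
            .post("https://api.groq.com/openai/v1/chat/completions")
            .bearer_auth(api_key)
            .json(&body)
            .send()?
            .error_for_status()?
            .json()?;

        // Print only the assistant's text so the answer appears immediately.
        println!(
            "{}",
            resp["choices"][0]["message"]["content"].as_str().unwrap_or("")
        );
        Ok(())
    }

Run it with GROQ_API_KEY set and the question as arguments, and it prints the single completion. One obvious lever on perceived speed would be requesting a streamed response and printing tokens as they arrive instead of waiting for the full completion.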

No comments yet
