
Ask HN: How to send context when you use LLMs via API for a B2B use-case?

2 points | by msnkarthik | 10 months ago
In our B2B use case, we pass data to an LLM via API, get the answers back, and display them to users. It's essentially a wrapper on top of an LLM API. But since each API call is one-off, it doesn't carry any context about the whole chat. How do you send that much context when you talk to GPT via the API rather than directly through chat? Wondering this for a B2B SaaS use case.
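For reference, the stateless pattern the question describes looks roughly like the sketch below, assuming the OpenAI Python SDK (>= 1.0); the model name and prompt text are placeholders. Each request carries only the current user message, so the model sees none of the earlier turns.

```python
# Minimal sketch of a one-off "wrapper" call (assumes the OpenAI Python SDK >= 1.0).
# Nothing from previous requests is sent, so the model has no memory of the chat.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def answer(user_question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": "You answer questions about the customer's data."},
            {"role": "user", "content": user_question},  # only the current turn is included
        ],
    )
    return response.choices[0].message.content
```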

1 comment

fdarkaou · 10 months ago
Most LLMs today have a large context window, so you can send a history of your chat.

I've built multiple chat demo apps (see anotherwrapper.com). What I basically did was store a full copy of the history in the DB, and then in a config file I specified how many previous messages I want to include in the chat history when interacting with the API.
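A minimal sketch of that approach, assuming the OpenAI Python SDK: the in-memory `HISTORY` dict stands in for the DB the commenter describes, and `MAX_HISTORY_MESSAGES` plays the role of the config value that controls how many previous messages get resent.

```python
# Sketch: persist every message, then replay only the last N on each API call.
# HISTORY is an in-memory stand-in for the DB table described above.
from collections import defaultdict
from openai import OpenAI

client = OpenAI()                  # assumes OPENAI_API_KEY is set
MAX_HISTORY_MESSAGES = 10          # config value: how many past messages to include

HISTORY: dict[str, list[dict]] = defaultdict(list)  # swap for a real DB in production

def chat_turn(conversation_id: str, user_message: str) -> str:
    HISTORY[conversation_id].append({"role": "user", "content": user_message})

    # Full history is stored; only the tail is sent to the API.
    recent = HISTORY[conversation_id][-MAX_HISTORY_MESSAGES:]

    response = client.chat.completions.create(
        model="gpt-4o-mini",       # placeholder model name
        messages=[{"role": "system", "content": "You are a helpful assistant."}] + recent,
    )
    reply = response.choices[0].message.content
    HISTORY[conversation_id].append({"role": "assistant", "content": reply})
    return reply
```

Calling `chat_turn` repeatedly with the same `conversation_id` resends the most recent exchanges on every request, which is what makes the one-off API calls behave like a continuous chat.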