
Ask HN: How to send context when you use LLMs via API for a B2B use-case?

2 points by msnkarthik, 10 months ago
In our B2B use case, we pass data to LLMs via the API, get the answers back, and display them to users. It's essentially a wrapper on top of the LLM API. But since API calls are one-off, they don't carry much context about the whole chat. How do you send that much context when communicating with GPT via the API rather than directly through the chat interface? Wondering this for a B2B SaaS use case.
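(For context on the "one-off" point: chat-completion style APIs are stateless, so any prior turns you want the model to see must be resent with every request. A minimal sketch, assuming the OpenAI Python client and a hypothetical conversation:)

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The endpoint keeps no state between calls: each request must carry
# whatever earlier turns you want the model to "remember".
messages = [
    {"role": "system", "content": "You are a support assistant for Acme Corp."},
    {"role": "user", "content": "What plans do you offer?"},
    {"role": "assistant", "content": "We offer Basic, Pro, and Enterprise plans."},
    {"role": "user", "content": "Which one includes SSO?"},  # the new turn
]

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=messages,
)
print(response.choices[0].message.content)
```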

1 comment

fdarkaou, 10 months ago
Most LLMs today have a large context window where you can send a history of your chat.

I've built multiple chat demo apps (see anotherwrapper.com), and what I basically did was store a full copy of the history in the DB, then in a config file I specified how many previous messages to include in the chat history when interacting with the API.
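(A minimal sketch of the approach the comment describes: persist every turn in a database, then on each API call include only the last N messages, with N read from a config value. This assumes the OpenAI Python client; the table layout, function names, and the `MAX_HISTORY_MESSAGES` setting are illustrative, not from the original post.)

```python
import sqlite3
from openai import OpenAI

MAX_HISTORY_MESSAGES = 10  # "config file" value: how many prior turns to resend

client = OpenAI()
db = sqlite3.connect("chat_history.db")
db.execute(
    "CREATE TABLE IF NOT EXISTS messages ("
    "  id INTEGER PRIMARY KEY AUTOINCREMENT,"
    "  chat_id TEXT, role TEXT, content TEXT)"
)

def save_message(chat_id: str, role: str, content: str) -> None:
    """Persist every turn so the full history survives between stateless API calls."""
    db.execute(
        "INSERT INTO messages (chat_id, role, content) VALUES (?, ?, ?)",
        (chat_id, role, content),
    )
    db.commit()

def load_recent_history(chat_id: str, limit: int) -> list[dict]:
    """Fetch only the last `limit` messages to keep the prompt within budget."""
    rows = db.execute(
        "SELECT role, content FROM messages WHERE chat_id = ? "
        "ORDER BY id DESC LIMIT ?",
        (chat_id, limit),
    ).fetchall()
    return [{"role": r, "content": c} for r, c in reversed(rows)]

def ask(chat_id: str, user_message: str) -> str:
    save_message(chat_id, "user", user_message)
    messages = [{"role": "system", "content": "You are a helpful B2B assistant."}]
    messages += load_recent_history(chat_id, MAX_HISTORY_MESSAGES)
    response = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    answer = response.choices[0].message.content
    save_message(chat_id, "assistant", answer)
    return answer

print(ask("customer-42", "Can you summarise our last conversation?"))
```

(Counting messages is the simplest windowing policy; a token-based cutoff or summarising older turns are common refinements when histories grow long.)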