TechEcho
A tech news platform built with Next.js, providing global tech news and discussions.


Ask HN: How to save an LLM state achieved with a series of prompts?

3 points by Exorust, almost 2 years ago

I'd like to load up an LLM that I've taught about something through a series of prompts.

2 comments

armchairhacker, almost 2 years ago

I'm not sure exactly how it's formatted (and it may differ between GPT, LLaMA, and others), but when an LLM "remembers" your chat history, it's actually getting all of the previous prompts and its responses prepended to your next prompt as its input.

Something like:

    System: You are a helpful assistant.
    User: What is the country with the largest population?
    Assistant: India.
    User: Ok, what is the next largest?
    Assistant: China.
    User: How about the next largest?
    Assistant: United States.
    User: And the next?
    Assistant:

and the model outputs:

    Indonesia.

But you can't just paste your history into a single prompt, because it would be formatted differently. If you're using a model like ChatGPT, there may be a way to copy the conversation. Otherwise, you can definitely edit the conversation (including the AI's responses) through the API. See https://platform.openai.com/playground for easy access; there's probably also a command-line tool or alternative frontend to which you can give your API key to make it easier.
brucethemoose2, almost 2 years ago

The LLM has no memory. What you actually feed it, every time you respond, is the entire conversation, truncated to fit its input limit.

But you are bound to the interface, especially if you are not running a local LLM like LLaMA.

Different models have different prompting syntax.