I’ve been trying to get GPT-4 to help with a basic Python script to clean up some messy HTML, and because it erases the history I’m having to paste the entire script back in every few prompts so it doesn’t wander off and write something useless.<p>However, if I paste in too much it locks up the text box and tells me to refresh.<p>This resets the history.<p>For a productivity tool, it’s kind of funny having to teach myself how to work around complete feature breakage and service degradation. Goldfish-memory mode might actually be wasting my time!
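For what it's worth, basic tag-stripping doesn't strictly need a model at all: Python's stdlib `html.parser` can do it. Here's a minimal sketch (the `TextExtractor` class and `clean_html` helper are illustrative names, not anything from the thread) that pulls out text content while skipping `script`/`style` blocks:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects text content, skipping script/style blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0  # >0 while inside a script/style element

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.parts.append(data.strip())

def clean_html(markup: str) -> str:
    """Return the visible text of `markup`, joined with single spaces."""
    parser = TextExtractor()
    parser.feed(markup)
    return " ".join(parser.parts)

print(clean_html("<div><script>x()</script><p>Hello, world</p></div>"))
```

For gnarlier real-world markup, a third-party parser like BeautifulSoup is more forgiving, but for a quick cleanup script the stdlib version avoids any dependency.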
I expect LLMs to become significantly more useful once they have multi-session continuity and access to a persistent store of data personally relevant to me. Of course, these are also features that tend to increase the security, privacy, and abuse surface of LLMs.<p>What I hope someone comes up with is an open standard that lets me choose when to grant an LLM private, temporary 'merge access' to my personal data (e.g. emails, documents, calendar, browsing history, online posts across platforms and accounts, chats, etc.). If no vendor offers something like this, I'll probably host my own private LLM based on whatever the best available open-source project is.
It's pretty much fine; the model is still powerful enough that I can recreate what I want trivially. If anything, this goes to show that you should independently persist any prompts that are important to you!
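Persisting prompts locally is a one-liner habit worth automating. A minimal sketch (the `save_prompt`/`load_prompts` helpers and the `prompt_log.jsonl` filename are my own invention, just to illustrate the idea) that appends each prompt to a local JSON-lines file:

```python
import json
import time
from pathlib import Path

PROMPT_LOG = Path("prompt_log.jsonl")  # hypothetical local log file

def save_prompt(prompt: str, log: Path = PROMPT_LOG) -> None:
    """Append the prompt with a timestamp, one JSON object per line."""
    entry = {"ts": time.time(), "prompt": prompt}
    with log.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")

def load_prompts(log: Path = PROMPT_LOG) -> list[str]:
    """Return all saved prompts in the order they were written."""
    if not log.exists():
        return []
    with log.open(encoding="utf-8") as f:
        return [json.loads(line)["prompt"] for line in f]
```

The append-only JSONL format means a crash mid-write can corrupt at most the last line, and the history survives any chat-service reset.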