I use ChatGPT regularly for many different tasks: coding, health Q&A, summarizing documents, and so on. The conversations stack up in the sidebar, which becomes very difficult to manage. I frequently need to refer back to a prompt I wrote previously, but I usually give up looking for it because the scroll-and-search process is so tedious. Is there an easier way? How do you manage your prompts in ChatGPT?
This is actually one of my biggest issues with ChatGPT: it's not really possible to create any kind of reusable workflow. The best built-in option is Custom GPTs, which let you create a specific chatbot for one task.<p>There are many UI projects for LLMs, openwebui.com for example. But even with the OpenAI API as a backend, they don't provide as many features as ChatGPT (web search, Python-based data processing, charting, image generation).<p>I think one of the most promising approaches would be some kind of user scripts for extending the official ChatGPT UI (user scripts in the browser with a tool like Violentmonkey, FireMonkey, or anything similar to the good old Greasemonkey). I don't use that myself, though, and I don't know if there are any good extensions for ChatGPT.
Why not just ask the bot to do something? I use it daily and don't spend more than 5 seconds thinking of a "prompt".<p>The only exception is function calling (or whatever they call structured output these days), but that is simply embedded in my programs, or other people's, that call the API.
Simple: if you're using an Apple device, go to Settings → Keyboard → Text Replacement and add your prompt.<p>For example:<p>Replace: !rw<p>With: "Rewrite this using simple words: {your_content}"<p>Now, whenever I type "!rw", it expands to "Rewrite this using simple words: {your_content}".<p>You don't need to switch between multiple tabs, use an extension, or refer to the documentation again and again.
I humbly submit my project as a possible solution.<p>It's open source on GitHub:<p><a href="https://github.com/danielmiessler/fabric">https://github.com/danielmiessler/fabric</a><p>The tagline is "Human Augmentation Using AI", but really it's a crowd-sourced library of prompts.<p>Basically, I solve a problem once, to a satisfactory level, and then upload the prompt to Fabric so everyone else can do the same.<p>Over 22K stars just since January 2024.
I do a frequent data export and then keep the resulting text open in an adjacent tab to search.<p>The feature is kind of hidden: profile menu (upper-right corner) > Settings > Data controls > Export data. You then get an email with a download link. Unzip the archive and open chat.html.
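If you'd rather search an export like that from the command line, the zip also contains a conversations.json file alongside chat.html. A minimal Python sketch, assuming that file is a JSON list of conversation objects each carrying a "title" field (an assumption; check your own export's layout):

```python
import json

def search_export(path, needle):
    """Return titles of conversations in a ChatGPT data export that
    mention `needle` (case-insensitive).

    Assumes `path` points at the export's conversations.json: a JSON
    list of conversation objects, each with a "title" field. Adjust
    the keys if your export differs.
    """
    with open(path, encoding="utf-8") as f:
        conversations = json.load(f)
    needle = needle.lower()
    hits = []
    for conv in conversations:
        # Search the whole serialized record rather than specific
        # message fields, so we don't depend on the exact schema.
        if needle in json.dumps(conv).lower():
            hits.append(conv.get("title", "(untitled)"))
    return hits
```

For example, `search_export("conversations.json", "regex")` would list every chat whose content mentions "regex", which beats scrolling the sidebar.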
I use GPT through the API, hooked up to a Telegram bot.<p>I save common prompts with<p><pre><code> /create blahbot you always blah
</code></pre>
And then I can ask any bot something by saying<p><pre><code> /blahbot tell me about blah
</code></pre>
It's kind of hacky and misses some of the QoL features built into ChatGPT, but it's super convenient: I use Telegram a lot on phone and desktop anyway, and it has pretty good search functionality. (It's also cheaper for me to pay per token than the flat fee, plus friends and family can use it too.)
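The command scheme above can be sketched in a few lines of Python. This is illustrative, not the commenter's actual bot: it only shows the prompt bookkeeping, with the Telegram plumbing and OpenAI call left as a stub you'd swap in.

```python
# Minimal sketch of the /create <name> <prompt> command scheme.
# Prompts live in an in-memory dict here; a real bot would persist them.

bots = {}  # bot name -> saved system prompt

def handle_message(text, call_model=None):
    """Dispatch a Telegram-style command message.

    /create blahbot you always blah  -> registers a system prompt
    /blahbot tell me about blah      -> runs the saved prompt
    """
    if not text.startswith("/"):
        return "Commands start with /"
    command, _, rest = text[1:].partition(" ")
    if command == "create":
        name, _, prompt = rest.partition(" ")
        if not name or not prompt:
            return "Usage: /create <name> <system prompt>"
        bots[name] = prompt
        return f"Created /{name}"
    if command in bots:
        if call_model is None:
            # Stub: a real bot would call the OpenAI API with
            # bots[command] as the system message and `rest` as
            # the user message, then relay the reply to Telegram.
            return f"[{bots[command]}] {rest}"
        return call_model(system=bots[command], user=rest)
    return f"Unknown bot /{command}"
```

Hooking `handle_message` up to an actual Telegram update loop (e.g. via a bot framework) is the only missing piece.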
I use Claude and Cursor because they’re so much better and don’t require any fancy prompting to not completely suck.<p>For snippets, I use the system built into Raycast.
For non-programming questions, I just ask Perplexity the way I'd ask a person, and it's orders of magnitude better than Google or any single LLM.
I have Trello cards where I paste a link to the ChatGPT session, so I can reuse the context weeks or months later without having to spend time searching for it.
It's a tedious issue that I face every day too. I solved it for myself by keeping a single conversation per type of task.<p>What I do is create a few prompts that cover my everyday tasks. For example, if you code daily, create a single prompt that makes ChatGPT understand your coding requirements.<p>Then open that one coding conversation any day; no need to scroll up and down (which is very frustrating, btw).<p>Hope this helps.
Well, somebody could integrate an Algolia/Typesense-style search function into their own chat GUI, but I don't think that's a very defensible endeavour in itself: it would take OpenAI or Perplexity hours to add such functionality to their GUIs.
I have used <a href="https://github.com/lastmile-ai/aiconfig">https://github.com/lastmile-ai/aiconfig</a> to encode some of my more common prompts into application-like things. You can encode them into a yaml/json file.
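To give a feel for the idea, an aiconfig file is essentially a declarative list of named, parameterized prompts. A rough sketch of the JSON shape (the field names here are from memory and may not match the current schema exactly; the repo's README has the authoritative format):

```json
{
  "name": "my_prompts",
  "schema_version": "latest",
  "prompts": [
    {
      "name": "summarize",
      "input": "Summarize the following document in five bullet points: {{doc}}",
      "metadata": { "model": { "name": "gpt-4" } }
    }
  ]
}
```

Once prompts live in a versionable file like this, they can be reused from code instead of being dug out of a chat sidebar.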
It would be nice if, when you ask the same question twice, it looked through your previously stored questions and offered the earlier answer in case you want to reuse it. I find it tedious to dig through old conversations for an answer I need.
I'm using Raycast, not ChatGPT, but I think you can simply create a custom GPT for each use case (with the prompt as a system instruction) and then use @ in the default new chat to quickly switch to the desired bot.
Sounds like you’d do better using a front end like SillyTavern or something.<p>It's designed for role playing, but you could easily create multiple characters, e.g. one summarization character, etc.
I built my own prompt management system. It's similar to a CMS, except there are connections and taxonomic relationships between prompts and generated replies. I haven't seen anything like it yet. Too busy to put it up online for others to use.
I tried it when it was new and found it impressive, cute as a gimmick, and intriguing from a technical/scientific point of view. Since then I haven't found any uses for it and have stopped using it altogether.