I use ChatGPT regularly for a lot of different tasks: coding, health Q&A, summarizing docs, and so on. The prompts stack up in the sidebar, which quickly becomes hard to manage. I frequently need to refer back to a prompt I wrote previously, but I usually give up looking for it because the scroll-and-search process is so tedious. I was wondering if there is an easier way. How do you manage your prompts in ChatGPT?
There are many UI projects for LLMs, openwebui.com for example. But even with the OpenAI API as a backend, they don't provide as many features as ChatGPT (web search, Python processing of data, charting, image generation).
I think one of the most promising approaches would be some kind of user script for extending the official ChatGPT UI (user scripts in the browser, with a tool like Violentmonkey, FireMonkey, or anything similar to the good old Greasemonkey). I don't use them myself, though, and I don't know if there are any good extensions for ChatGPT.
I had success extracting the existing conversation and adding it to a new window, but gave up after trying to modify the ChatGPT UI (i.e. to format what I'd just pasted in so it'd look like the conversation left off where I branched). The UI just kept re-rendering/re-painting itself non-stop, overriding my changes. I didn't try to push past that. I'm sure I could use JS or something else to massage the UI further, but it seemed like a non-trivial task. Maybe something to look into some weekend.
https://github.com/lulzury/got-branch-convo
The only exception is function calling (or whatever they call structured output these days), but that is simply embedded in my or other people's programs that call the API.
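To make the "embedded in a program" part concrete, here's a minimal sketch of building a structured-output request payload. It assumes the chat-completions JSON shape with a `response_format` JSON schema; the model name and schema fields are placeholders, not anyone's actual prompt:

```python
import json


def build_extraction_request(text: str) -> dict:
    """Build a chat-completions payload that asks for JSON output
    constrained by a schema (field names here are illustrative)."""
    return {
        "model": "gpt-4o-mini",  # placeholder model name
        "messages": [
            {"role": "system", "content": "Extract the fields as JSON."},
            {"role": "user", "content": text},
        ],
        "response_format": {
            "type": "json_schema",
            "json_schema": {
                "name": "extraction",
                "schema": {
                    "type": "object",
                    "properties": {"summary": {"type": "string"}},
                    "required": ["summary"],
                },
            },
        },
    }


payload = build_extraction_request("Quarterly revenue rose 12%.")
print(json.dumps(payload, indent=2))
```

The point is that the prompt and its output contract live in code, not in a chat sidebar, so there's nothing to "manage" in the UI.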
As a regular person, I use conversational English for basically every prompt I give ChatGPT.
For my application, I have prompts that I have stored in source code, but those need to have very consistent, exact inputs and outputs (mostly JSON), so creating a specific prompt is important for those.
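A sketch of what a prompt stored in source code with an exact JSON contract might look like (the prompt text and expected keys here are made up for illustration):

```python
import json

# Keeping the prompt in source code makes its input/output contract
# explicit and reviewable alongside the code that depends on it.
SUMMARIZE_PROMPT = (
    "Summarize the text below. Respond with JSON only, exactly "
    '{"title": <string>, "bullets": [<string>, ...]}.\n\n'
    "Text:\n"
)


def render_prompt(text: str) -> str:
    """Attach the input text to the fixed prompt."""
    return SUMMARIZE_PROMPT + text


def parse_reply(raw: str) -> dict:
    """Validate the model's reply against the contract the prompt states."""
    data = json.loads(raw)
    if not isinstance(data.get("title"), str) or not isinstance(
        data.get("bullets"), list
    ):
        raise ValueError("model reply missing required fields")
    return data


reply = '{"title": "Demo", "bullets": ["one", "two"]}'
print(parse_reply(reply)["title"])  # -> Demo
```

Validating the reply in code is what makes the "very consistent, exact inputs and outputs" requirement enforceable rather than hoped for.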
For anything a human can parse, regular ChatGPT works perfectly fine!
If you're doing ad-hoc things, yeah, just ask. If not, prompt management saves lots of time.
https://lmstudio.ai/
https://untimelyunicorn.gumroad.com/l/machato
http://msty.app/
https://www.macgpt.com/
Or, of course:
https://llm.datasette.io/
For example,
Replace: !rw
With: "Rewrite this using simple words: {your_content}"
So, whenever I type "!rw", it replaces the text with "Rewrite this using simple words: {your_content}".
You don't need to switch between multiple tabs, use an extension, or refer to the documentation again and again.
% apt install autokey-gtk
It's open-source on GitHub.
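The trigger-to-template idea above is easy to prototype outside any particular tool; here's a minimal sketch using the `!rw` example (the trigger and template are just the ones from the comment, and the placeholder syntax is an assumption):

```python
# Map triggers to templates with a {your_content} placeholder.
SNIPPETS = {
    "!rw": "Rewrite this using simple words: {your_content}",
}


def expand(trigger: str, content: str) -> str:
    """Replace a trigger with its template, filling in the placeholder."""
    template = SNIPPETS[trigger]
    return template.replace("{your_content}", content)


print(expand("!rw", "The aforementioned proposal was ratified."))
# -> Rewrite this using simple words: The aforementioned proposal was ratified.
```

A real text expander does the same substitution system-wide, so the snippet works in any input field, including the ChatGPT prompt box.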
https://github.com/danielmiessler/fabric
The tagline is Human Augmentation Using AI, but really it's a crowd-sourced library of prompts.
Basically, I solve a problem once, to a satisfactory level, and then I upload it to Fabric so everyone else can do the same.
Over 22K stars just since January 2024.
The feature is kinda hidden: preferences (upper-right corner) > Settings > Data controls > Export data. You then get an email and download the archive from there. Unzip it and open chat.html.
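Once you have the export, you can also search it programmatically instead of scrolling chat.html. A sketch, assuming the archive contains a `conversations.json` with a `title` field per conversation (check your own export's layout; the sample data here is invented):

```python
import json


def find_conversations(conversations: list, query: str) -> list:
    """Return conversation titles containing the query, case-insensitively."""
    q = query.lower()
    return [c["title"] for c in conversations if q in c.get("title", "").lower()]


# With a real export you would do:
#   conversations = json.load(open("conversations.json"))
sample = [
    {"title": "Debugging a Rust borrow error"},
    {"title": "Dinner ideas"},
    {"title": "Rust async question"},
]
print(find_conversations(sample, "rust"))
# -> ['Debugging a Rust borrow error', 'Rust async question']
```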
I save common prompts with
And then I can ask any bot something by saying
Kind of hacky, and it misses some of the QoL features built into ChatGPT, but it's super convenient: I use Telegram a lot on phone and desktop anyway, it's got pretty good search functionality, and it's cheaper to pay per token than the flat fee. Plus, friends and family can use it too.

For snippets, I use the system built into Raycast. For non-programming questions, I just ask Perplexity as I'd ask a person, and it's orders of magnitude better than Google or any single LLM.
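The save-and-search part of that workflow doesn't actually need Telegram; a minimal sketch of a local prompt store (all names here are made up, and the JSON file format is just one possible choice):

```python
import json
import tempfile
from pathlib import Path


class PromptStore:
    """Tiny JSON-backed store: save prompts by name, search by substring."""

    def __init__(self, path: Path):
        self.path = path
        self.prompts = json.loads(path.read_text()) if path.exists() else {}

    def save(self, name: str, text: str) -> None:
        self.prompts[name] = text
        self.path.write_text(json.dumps(self.prompts, indent=2))

    def search(self, query: str) -> list:
        """Match the query against prompt names and bodies."""
        q = query.lower()
        return [
            name
            for name, text in self.prompts.items()
            if q in name.lower() or q in text.lower()
        ]


path = Path(tempfile.mkdtemp()) / "prompts.json"
store = PromptStore(path)
store.save("rewrite", "Rewrite this using simple words:")
store.save("summarize", "Summarize the following document:")
print(store.search("simple"))  # -> ['rewrite']
```

Persisting to a plain JSON file keeps the store portable between machines, which is most of what the Telegram setup buys.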