xenodium · 3 years ago
I've had a positive experience building a ChatGPT shell for Emacs [1]. Not having to context switch between the editor and browser is great. With Emacs being a text paradise, there are all sorts of possible integrations, like Org Babel integration for elisp [2] or SwiftUI [3].

In addition to a shell, functions for inserting GPT responses can be pretty neat too. For example, creating org tables [4].

[1]: https://xenodium.com/chatgpt-shell-available-on-melpa

[2]: https://xenodium.com/images/chatgpt-shell-available-on-melpa...

[3]: https://xenodium.com/images/chatgpt-shell-available-on-melpa...

[4]: https://raw.githubusercontent.com/xenodium/chatgpt-shell/mai...

tikimcfee · 3 years ago
I’ve not seen that SwiftUI code block tool before, I’m in absolute love. Mind linking some of your configurations or some of the tool names?

EDIT- my mistake, the first link points it out as https://github.com/xenodium/ob-swiftui. Thanks for sharing!

xenodium · 3 years ago
Check https://xenodium.com/ob-swiftui-updates with latest changes. I haven't gotten around to updating the README in the project page.
newusertoday · 3 years ago
this requires an API key. By any chance do you know of something that can be used in Emacs without a key?
BaculumMeumEst · 3 years ago
there's not really a sane way to communicate with OpenAI/ChatGPT without an API key
chaxor · 3 years ago
There are plenty (perhaps far too many) tools for doing basically `curl` to OpenAI. Local LLM tools are needed, however, and are much better for deploying systems on terabytes of data at a fraction of the cost.
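The "basically `curl`" pattern these tools wrap is just an HTTP POST to the chat completions endpoint. A stdlib-only sketch (endpoint and payload shape as documented by OpenAI at the time; `OPENAI_API_KEY` is assumed to be set in the environment):

```python
import json
import os
import urllib.request

API_URL = "https://api.openai.com/v1/chat/completions"

def build_chat_request(prompt, model="gpt-3.5-turbo"):
    """Build the JSON payload for a single-turn chat completion."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def ask(prompt):
    """POST the prompt to the API and return the assistant's reply."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_chat_request(prompt)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Every CLI wrapper in this space is some variation on the above, plus argument parsing and streaming.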
simonw · 3 years ago
Yeah, that's on my roadmap for "llm " (hence the name) - I want to be able to use the same tool to execute against local models as well.

Everything that goes through the tool can be logged to SQLite so this should make it easier to build up comparisons of different models.
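The logging idea is straightforward with the stdlib; this is a sketch of the concept (the table layout here is hypothetical, not llm's actual schema):

```python
import sqlite3
import time

def open_log(path=":memory:"):
    """Open (or create) a SQLite log of prompts and responses."""
    db = sqlite3.connect(path)
    db.execute(
        """CREATE TABLE IF NOT EXISTS log (
               id INTEGER PRIMARY KEY,
               ts REAL,
               model TEXT,
               prompt TEXT,
               response TEXT
           )"""
    )
    return db

def log_exchange(db, model, prompt, response):
    """Record one prompt/response pair with a timestamp."""
    db.execute(
        "INSERT INTO log (ts, model, prompt, response) VALUES (?, ?, ?, ?)",
        (time.time(), model, prompt, response),
    )
    db.commit()

def compare_models(db, prompt):
    """All logged responses to the same prompt, one row per model."""
    return db.execute(
        "SELECT model, response FROM log WHERE prompt = ? ORDER BY model",
        (prompt,),
    ).fetchall()
```

Once every exchange lands in one table, comparing models is just a query over the same prompt.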

gdubya · 3 years ago
Does it work with LocalAI [1] if you change the openai.api_base value to http://localhost:8080/ ?

1. https://github.com/go-skynet/LocalAI

dndn1 · 3 years ago
+1 for more llms and local llms
kordlessagain · 3 years ago

  $ curl -s https://news.ycombinator.com | strip-tags | ttok -t 4000 | llm --system 'summary bullet points' -s

anotherpaulg · 3 years ago
I use the aichat [1] command line tool a lot for these sort of ad hoc chats. It takes piped input and has nice configurability for setting up a variety of system prompts ("roles"), etc.

If you want to use GPT-4 to manipulate and edit files in your local file system, you can use my cli tool aider [2]. It's intended for generating and editing code, but you can use it to chat with GPT-4 to read, edit and write any text files on your machine. If the files are under git source control, it will commit the changes as they happen as well.

Here’s a transcript of aider editing the ANSI-escape codes in an asciinema screencast recording, for example[3].

[1] https://github.com/sigoden/aichat

[2] https://github.com/paul-gauthier/aider

[3] https://aider.chat/examples/asciinema.html

H8crilA · 3 years ago
I wonder how many different ways people use to do basic ChatGPT queries.

My preferred method is to run a WhatsApp bot; this way I can easily use the LLM on my phone as well. On a computer I just use WhatsApp Web, which I keep running anyway. This method also natively supports iterated conversations.

That, plus some scripts for repetitive stuff.
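The "iterated conversations" part of a messaging-bot setup comes down to keeping per-sender history and replaying it on each turn. A hypothetical relay sketch (the `complete` callable stands in for whatever actually calls the model, e.g. the OpenAI chat API):

```python
class ChatRelay:
    """Per-sender conversation state for a messaging-bot relay.

    `complete` is any callable mapping a message history (a list of
    {role, content} dicts) to a reply string; it is stubbed here.
    """

    def __init__(self, complete):
        self.complete = complete
        self.histories = {}  # sender id -> message history

    def handle(self, sender, text):
        """Append the incoming message, get a reply, and remember both."""
        history = self.histories.setdefault(sender, [])
        history.append({"role": "user", "content": text})
        reply = self.complete(history)
        history.append({"role": "assistant", "content": reply})
        return reply
```

Each sender gets an independent thread of context, which is what makes the bot feel like a native multi-turn chat rather than one-off queries.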

just-ok · 3 years ago
If you haven't heard, there's an official iOS app[1], so that's probably a far more efficient/private alternative to a custom bot.

[1]: https://apps.apple.com/us/app/openai-chatgpt/id6448311069

krat0sprakhar · 3 years ago
That sounds great! Can you share some docs on the WhatsApp bot? IIRC, those APIs were only available to businesses and not individuals.
H8crilA · 3 years ago
The OpenAI API is available to everyone. I've spent well over $100 just trying various things out over the past two months. I was not trying to save on it; you can do quite a lot even on $10, just make sure to do some napkin maths before you query some endpoint a lot of times. For example, it's a lot easier to spend a lot on DALL-E than it is on GPT-3.5.
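The napkin maths amounts to tokens times per-token price; a minimal sketch (the per-1K-token rates below are the gpt-3.5-turbo prices published around the time of this thread, used purely as example inputs; always check the current pricing page):

```python
def estimate_cost(tokens_in, tokens_out, price_in_per_1k, price_out_per_1k):
    """Back-of-the-envelope API cost in dollars for one request."""
    return (tokens_in * price_in_per_1k + tokens_out * price_out_per_1k) / 1000

# e.g. a 3500-token prompt with a 500-token answer at
# $0.0015/1K input and $0.002/1K output tokens:
per_call = estimate_cost(3500, 500, 0.0015, 0.002)
# a batch job that repeats this 1000 times:
batch = 1000 * per_call
```

Multiplying the per-call figure by the expected number of calls before running a loop is usually enough to avoid surprises.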
tester457 · 3 years ago
To have an LLM on my Android I prefer to use Termux, since WhatsApp and its API are a hassle.
ignorantguy · 3 years ago
How do you do this? I am interested to learn more about this. Any documentation would be awesome
H8crilA · 3 years ago
Try a Google search for GitHub projects that do that. Or really any other GPT idea. People are building many copies of everything, so I'm not even going to recommend the one that I'm using because there's probably a better one already :). It's simple code so you can also modify it to your liking.
verdverm · 3 years ago
There's an awesome list for BYOK (bring your own key) projects here: https://github.com/reorx/awesome-chatgpt-api#cli
yewenjie · 3 years ago
Charmbracelet recently developed 'mods' which has some cool ideas around Unix pipes.

https://github.com/charmbracelet/mods

gkfasdfasdf · 3 years ago
How prevalent is GPT-4 api access? I feel like I've been on the waiting list for forever, yet this tool has GPT-4 as the default.
simonw · 3 years ago
GPT-4 isn't the default - it uses gpt-3.5-turbo by default, because that's massively cheaper.

If you want to run against GPT-4 (and your API key has access) you can pass "-4" or "--gpt4" as an option.

CORRECTION: Sorry, I was talking about my "llm" tool - https://github.com/simonw/llm - it looks like "mods" does indeed default to 4: https://github.com/charmbracelet/mods/blob/e6352fdd8487ff8fc...

tbiehn · 3 years ago
I keep plugging my own… yet another API invoker, written in Go, with parallel queries, templates, and config files: https://github.com/tbiehn/thoughtloom. It has some interesting examples, but I expect the population of users to be constrained to the 5 of us who enjoy the CLI, jq, and writing bash scripts.