Readit News
kyledrake · 6 months ago
This interface needs a better relationship with streaming; there is always a lag in the response, and a lot of people are going to want to stream the response in non-blocking threads instead of hanging the process waiting for it. It's possible this is just a documentation issue, but either way streaming is a first-class citizen in anything that takes more than a couple of seconds to finish and uses IO.
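The pattern I mean looks something like this (a toy sketch with a stand-in client, not this library's API):

```ruby
# Toy stand-in for a streaming chat client; a real client would
# yield chunks as they arrive over the network.
class FakeChat
  def ask(_prompt)
    %w[Once upon a time].each { |word| yield "#{word} " }
  end
end

chat  = FakeChat.new
queue = Queue.new

# Run the blocking streaming call in a background thread so the
# main thread stays free and just drains the queue.
producer = Thread.new do
  chat.ask("Tell me a story") { |chunk| queue << chunk }
  queue << :done
end

chunks = []
while (chunk = queue.pop) != :done
  chunks << chunk
end
producer.join
puts chunks.join # => "Once upon a time "
```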

Aside from that the DSL is quite excellent.

bradgessler · 6 months ago
There’s a whole world of async IO in Ruby that doesn’t get enough attention.

Check out the async gem, including async-http, async-websockets, and the Falcon web server.

https://github.com/socketry/falcon

earcar · 6 months ago
Thank you for your kind words!

Valid point. I'm actually already working on better streaming via async-http-faraday, which configures Faraday's default adapter to use async_http, paired with Falcon and async-job instead of thread-based approaches like Puma and SolidQueue. This should significantly improve resource efficiency for AI workloads in Ruby - something I'm not aware any other major Ruby LLM library does. The current block-based approach is idiomatic Ruby, but the upcoming async support will make the library even better for production use cases. Stay tuned!
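For the curious, the wiring is roughly this (gem names from the comment above; treat the exact requires as an assumption until the feature ships):

```ruby
# Gemfile (hypothetical setup for the async stack described above)
gem "falcon"              # fiber-based web server
gem "async-http-faraday"  # async_http adapter for Faraday
gem "async-job"           # fiber-based background jobs

# e.g. config/initializers/async_http.rb
# Registers the adapter and makes :async_http Faraday's default,
# so HTTP calls yield the fiber instead of blocking a thread.
require "async/http/faraday/default"
```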

joevandyk · 6 months ago
From https://rubyllm.com/#have-great-conversations

    # Stream responses in real-time
    chat.ask "Tell me a story about a Ruby programmer" do |chunk|
      print chunk.content
    end

jupp0r · 6 months ago
This will synchronously block until `chat.ask` returns, though. Be prepared to pay for it: tens to low hundreds of MB of your whole app's memory held alive doing nothing (other than handling new chunks) until whatever streaming API this is using under the hood finishes streaming.
kyledrake · 6 months ago
That looks good, I didn't see that earlier.
jatins · 6 months ago
Such a breath of fresh air compared to poor DX libraries like langchain
nullpoint420 · 6 months ago
I’ve found the Ruby community really cares about DX. Not sure why it’s not the same in other language communities
toasterlovin · 6 months ago
I don’t really mean this to be derogatory toward people who enjoy other things, but Ruby is a language and ecosystem by and for people who have taste.
choxi · 6 months ago
Matz said he designed Ruby to optimize for developer happiness; it’s been a core principle of the language since it was created
RangerScience · 6 months ago
Every language prioritizes something (or some things) because every language was made by a person (or people) with a reason: Python and correctness, Java and splitting up work, Go and something like "simplicity" (not that these are the only priorities for each language). As another comment points out, Matz prioritized developer happiness.

My favorite example of this is the amazingly useful and amazingly whack Ruby array arithmetic: subtraction (`arr1 - arr2`) is element-wise removal, but addition (`arr1 + arr2`) is a simple append. These are almost always exactly what you want when you reach for them, but they're completely "incorrect" mathematically.
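Concretely, in a REPL:

```ruby
a = [1, 2, 2, 3]
b = [2, 4]

a - b  # => [1, 3]               removes every occurrence of each element of b
a + b  # => [1, 2, 2, 3, 2, 4]   plain concatenation, duplicates kept
a | b  # => [1, 2, 3, 4]         the mathematically "correct" union is spelled |
```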

rochak · 6 months ago
Umm, doesn’t Go do so as well? Personally, I’ve had a better experience working with Go tooling.
jasongill · 6 months ago
I was an early contributor to Langchain and it was great at first - keep in mind, that's before chat models even existed, not to mention tools, JSON mode, etc.

Langchain really, I think, pushed the LLM makers forward toward adding those features but unfortunately it got left in the dust and became somewhat of a zombie. Simultaneously, the foundational LLM providers kept adding things to turn them more into a walled garden, where you no longer needed to connect multiple things (like scraping websites with one tool, feeding that into the LLM, then storing in a vector datastore - now that's all built in).

I think Langchain has tried to pivot (more than once perhaps) but had they not taken investor $$ early on (and good for them) I suspect that it would have just dried up and the core team would have gone on to work at OpenAI, Anthropic, etc.

ekianjo · 6 months ago
langchain and llamaindex are such garbage libraries: not only do they never document half the features they have, they also keep breaking their APIs from one version to the next.
brokegrammer · 6 months ago
I was about to mention those. I decided a while ago to build everything myself instead of relying on these libraries. We could use a PythonLLM over here because it seems like nobody cares about developer experience in the Python space.
earcar · 6 months ago
Thank you! This is what the Ruby community has always prioritized - developer experience. Making complex things simple and joyful to use isn't just aesthetic preference, it's practical engineering. When your interface matches how developers think about the problem domain, you get fewer bugs and more productivity.
olegp · 6 months ago
Would anyone happen to know of a similar library with as good DX but for JavaScript or TypeScript?
mathgeek · 6 months ago
Perhaps something like https://llmjs.themaximalist.com/

SkyPuncher · 6 months ago
IMO, the samples look great because they're ridiculously simple.

It doesn't deal with any of the hard problems you'll routinely face in implementation.

someothherguyy · 6 months ago
What about it is a breath of fresh air? What do the other libraries do that this doesn't?
gregmolnar · 6 months ago
Be careful with the examples though: https://github.com/crmne/ruby_llm/issues/25
earcar · 6 months ago
Thanks for flagging this. The eval was only in the docs and meant only as an example, but we definitely don't want to promote dangerous patterns in the docs. I updated them.
soheil · 6 months ago
bobby drop table, still a thing
ketzo · 6 months ago
Is this gonna be the thing that finally makes me try Rails? Ruby syntax really is just nice.
drdaeman · 6 months ago
I think it's the very nice-looking and clean high-level API that should be a pleasure to use (when it fits the job, of course).

I'm pretty sure this API's semantics (an instance builder to configure, then ask/paint/embed with a language-native way to handle streaming and declarative tools) would look beautiful and be easy to use in many other languages; I can imagine a similar API - save, of course, for the Rails stuff - in Python, C# or Erlang. While this level of API may not be sufficient for all possible LLM use cases, it should certainly speed up development when it's all that's needed.
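A toy sketch of that shape (all names made up, not any real library's API):

```ruby
# Builder-style configuration, then ask with a block for
# language-native streaming. Everything here is hypothetical.
class ToyChat
  def initialize
    @tools = []
  end

  def with_tool(tool)
    @tools << tool
    self # builder step returns self, so calls chain
  end

  def ask(prompt)
    answer = "echo: #{prompt}"           # a real client would call an LLM here
    answer.chars.each { |c| yield c } if block_given?
    answer
  end
end

chat = ToyChat.new.with_tool(:calculator)
chat.ask("hi")  # => "echo: hi"
```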

ilrwbwrkhv · 6 months ago
Oh just beautiful. Ruby is so expressive and concise.

If you look at the TypeScript options, it's like giving yourself a waterboarding session of your own volition.

gedy · 6 months ago
Is it really Ruby or they just made a nice interface? I don't see why a hypothetical TypeScript example would be all that different.

    // Just ask questions
    const chat: Chat = LLM.chat;
    chat.ask("What's the best way to learn TypeScript?");
    
    // Analyze images
    chat.ask("What's in this image?", { image: "ts_conf.jpg" });
    
    // Generate images
    LLM.paint("a sunset over mountains in watercolor style");
    
    // Create vector embeddings
    LLM.embed("TypeScript is powerful and scalable");
    
    // Let AI use your code
    class Calculator {
      description = "Performs calculations";
      params = {
        expression: { type: "string", desc: "Math expression to evaluate" },
      };
    
      execute(args: { expression: string }): string {
        return eval(args.expression).toString();
      }
    }
    
    chat.withTool(new Calculator()).ask("What's 123 * 456?");

williamcotton · 6 months ago
It's the extra parens, semi-colons, keywords and type annotations. Ruby makes the tradeoff for legibility above all else. Yes, you can obviously read the TypeScript, but there's an argument to be made that it takes more effort to scan the syntax as well as to write the code.

Also:

  const chat: Chat = LLM.chat;
...is not instantiating a class, where Ruby is doing so behind the scenes. You'd need yet another pair of parens to make a factory!

This is mainly a matter of syntactic style!
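Concretely (a sketch of the factory idea; the real library's internals may differ):

```ruby
# In Ruby a bare method call can hide instantiation, so the
# factory needs no parentheses at the call site.
module LLM
  class Chat; end

  def self.chat
    Chat.new
  end
end

LLM.chat  # a fresh Chat each time, no parens required
```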

freen · 6 months ago
Wow. So thoughtful.

Ruby: late to the party, brought a keg.

wtf242 · 6 months ago
I've been using https://github.com/alexrudall/ruby-openai for years with no issues; it's a fine gem and works great.
strudey · 6 months ago
aw thanks, glad you like it! more good Ruby AI libraries is a good thing IMO
dismalaf · 6 months ago
Ruby has a bunch of tools to interact with LLMs already. Also has had bindings to stuff like Torch and Tensorflow for years.
someothherguyy · 6 months ago
What about it is so thoughtful?
aguynamedben · 6 months ago
Ruby is alive and well!
init0 · 6 months ago
One of the most concise APIs to interact with an LLM!

Keep going! Happy to see ollama support PR in draft.