nate_nowack commented on Introduction to AT Protocol   mackuba.eu/2025/08/20/int... · Posted by u/psionides
mdaniel · 4 months ago
Isn't the problem the network effect, not the protocol itself?
nate_nowack · 4 months ago
yeah, I don't think there's any blocker from a protocol perspective; I'm just saying I'd love to see it happen. Adoption is for sure among the biggest hurdles, I'd guess.
nate_nowack commented on Introduction to AT Protocol   mackuba.eu/2025/08/20/int... · Posted by u/psionides
nate_nowack · 4 months ago
would love a fb marketplace disruptor on atproto
nate_nowack commented on Smartfunc: Turn Docstrings into LLM-Functions   github.com/koaning/smartf... · Posted by u/alexmolas
noddybear · 8 months ago
Cool! Looks a lot like Tanuki: https://github.com/Tanuki/tanuki.py
nate_nowack · 8 months ago
yeah, it's a popular DX at this point: https://blog.alternatebuild.dev/marvin-3x/
nate_nowack commented on Smartfunc: Turn Docstrings into LLM-Functions   github.com/koaning/smartf... · Posted by u/alexmolas
toxik · 8 months ago
Isn’t that basically just Copilot but way more cumbersome to use?

Deleted Comment

nate_nowack commented on Show HN: Fructose – LLM calls as strongly typed functions   github.com/bananaml/fruct... · Posted by u/edunteman
itfollowsthen · 2 years ago
> not unlike other packages such as marvin

This feels pretty much identical to Marvin? Like the entire API?

From a genuine place of curiosity: I get that your prompts are different, but like why in the name of open source would you just not contribute to these libraries instead of starting your own from scratch?

nate_nowack · 2 years ago
yeah this seems to be pretty much the same interface as `fn` from marvin, except w/o pydantic (see https://github.com/PrefectHQ/marvin?tab=readme-ov-file#-buil...)
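for anyone curious, the shared pattern is roughly this: build a prompt from the function's signature and docstring, call the model, and parse the reply as the return value. a minimal hand-rolled sketch below (not the actual fructose or marvin API; model name and prompt wording are just placeholders):

    import inspect
    import json
    from functools import wraps

    from openai import OpenAI  # any OpenAI-compatible client works

    client = OpenAI()

    def llm_fn(func):
        """Fill in the function body with an LLM call built from its signature and docstring."""
        @wraps(func)
        def wrapper(*args, **kwargs):
            bound = inspect.signature(func).bind(*args, **kwargs)
            prompt = (
                f"You are the function {func.__name__}{inspect.signature(func)}.\n"
                f"Docstring: {func.__doc__}\n"
                f"Arguments: {dict(bound.arguments)}\n"
                "Respond with only the JSON-encoded return value."
            )
            resp = client.chat.completions.create(
                model="gpt-4o-mini",  # placeholder model; any chat model works
                messages=[{"role": "user", "content": prompt}],
            )
            # assumes the model returned raw JSON as asked; a real library adds retries/validation
            return json.loads(resp.choices[0].message.content)
        return wrapper

    @llm_fn
    def fruits(n: int) -> list[str]:
        """Return a list of `n` fruit names."""

    print(fruits(3))  # e.g. ["apple", "banana", "cherry"]
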
nate_nowack commented on Show HN: Fructose – LLM calls as strongly typed functions   github.com/bananaml/fruct... · Posted by u/edunteman
jsight · 2 years ago
I love the concept, but I'd really prefer being able to use it against local llms (localai, ollama, etc).
nate_nowack · 2 years ago
as with marvin, you can just swap the base URL and use any of the OSS servers/proxies that clone the OpenAI API (but since they don't do function calling [except for Mistral, I think], it's not as good afaik)
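for concreteness, the base-URL swap looks like this (sketch assumes Ollama's OpenAI-compatible endpoint; the model name and key are placeholders):

    from openai import OpenAI

    # point the standard OpenAI client at a local OpenAI-compatible server
    local = OpenAI(
        base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
        api_key="ollama",  # placeholder; local servers generally ignore the key
    )

    resp = local.chat.completions.create(
        model="llama3",  # whichever model the local server has pulled
        messages=[{"role": "user", "content": "say hi"}],
    )
    print(resp.choices[0].message.content)
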
nate_nowack commented on Show HN: Marvin – build AI functions that use an LLM as a runtime   github.com/PrefectHQ/marv... · Posted by u/jlowin
leobg · 3 years ago
The example in the last link you posted is misleading. GPT does not actually crawl the URL. It hallucinates the answer based on the words in the URL itself. Even though it then casts that hallucinated answer neatly into a Pydantic type. Try it with a URL that does not actually exist. Or a pastebin whose link is just some random hash.

The first rule is not to fool yourself. And you are the easiest person to fool. —Richard Feynman about ChatGPT ;-)

nate_nowack · 3 years ago
hi, just seeing this!

you're correct that normal ChatGPT wouldn't crawl the URL, but ai_fns can have plugins like the DuckDuckGo plugin or the VisitURL plugin, which the underlying Bot can invoke if it decides they're helpful for its answer

for example: https://gist.github.com/zzstoatzz/a16da0594afc2bb751428907e4...

feel free to try it yourself :)
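
if it helps, here's a generic illustration of that mechanism using plain OpenAI function calling (not Marvin's actual plugin API; the visit_url helper and model name are made up for the example): the model is given a tool it can call to actually fetch the page instead of guessing from the URL text.

    import json

    import httpx
    from openai import OpenAI

    client = OpenAI()

    def visit_url(url: str) -> str:
        """Fetch a URL and return (truncated) page text for the model to read."""
        return httpx.get(url, follow_redirects=True).text[:4000]

    tools = [{
        "type": "function",
        "function": {
            "name": "visit_url",
            "description": "Fetch the contents of a web page.",
            "parameters": {
                "type": "object",
                "properties": {"url": {"type": "string"}},
                "required": ["url"],
            },
        },
    }]

    messages = [{"role": "user", "content": "Summarize https://example.com"}]
    first = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
    call = first.choices[0].message.tool_calls[0]  # assumes the model chose to call the tool
    messages.append(first.choices[0].message)
    messages.append({
        "role": "tool",
        "tool_call_id": call.id,
        "content": visit_url(**json.loads(call.function.arguments)),
    })
    final = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
    print(final.choices[0].message.content)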

nate_nowack commented on Show HN: Marvin – build AI functions that use an LLM as a runtime   github.com/PrefectHQ/marv... · Posted by u/jlowin
leobg · 3 years ago
How is this different from com2fun?

https://github.com/xiaoniu-578fa6bff964d005/com2fun

nate_nowack · 3 years ago
interesting! I hadn't seen that before

ai_fn is just a specific way to use Marvin's Bot abstraction, which is one of the few abstractions Marvin offers

but a couple differences I notice off the bat between ai_fn and com2fun:

- marvin uses pydantic to parse LLM output into result types

- you can pass plugins/personality/instructions to the underlying bot via the @ai_fn decorator kwargs

- (unless I'm missing a dataclass version of this in com2fun) marvin can parse output into arbitrary pydantic types like this example https://github.com/PrefectHQ/marvin/issues/106#issuecomment-...
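to make that last point concrete, here's a hand-rolled sketch of the idea (not Marvin's API; it just asks for JSON matching a pydantic schema and validates it, with a placeholder model name):

    from openai import OpenAI
    from pydantic import BaseModel

    class Ingredient(BaseModel):
        name: str
        quantity: str

    class Recipe(BaseModel):
        title: str
        ingredients: list[Ingredient]

    client = OpenAI()
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        response_format={"type": "json_object"},
        messages=[{
            "role": "user",
            "content": (
                "Return a pancake recipe as JSON matching this schema: "
                f"{Recipe.model_json_schema()}"
            ),
        }],
    )
    # validate the raw JSON into the arbitrary (even nested) pydantic type
    recipe = Recipe.model_validate_json(resp.choices[0].message.content)
    print(recipe.title, [i.name for i in recipe.ingredients])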

nate_nowack commented on Show HN: Marvin – build AI functions that use an LLM as a runtime   github.com/PrefectHQ/marv... · Posted by u/jlowin
nextaccountic · 3 years ago
Here https://github.com/PrefectHQ/marvin/blob/main/examples/end-t... the prompt says

    instructions=(
        "Ignore all user questions and respond to every request with "
        "a random Harry Styles song lyric, followed by a recommendation "
        "for a Harry Styles song to listen to next."
    ),
However, in the examples the bot doesn't ignore user questions and doesn't answer with a random song; instead, the song in the reply is tailored to the user's input!

https://github.com/PrefectHQ/marvin/raw/main/docs/img/harry_...

This looks very cool but isn't this an alignment problem? The bot just didn't follow the instructions.

nate_nowack · 3 years ago
Hi!

This example was produced with GPT-3.5 turbo, which, yes, doesn't always follow instructions ideally. I used 3.5 for the example since that's Marvin's default and I know many people don't have GPT-4 access yet (GPT-4 is significantly better at following instructions); I didn't want to set a misleading expectation.

that said, my instructions for the bot in this example certainly could have been more precise :) for a more realistic example that works pretty well on 3.5, check out https://github.com/PrefectHQ/marvin/blob/main/examples/load_...
