> state machines with llm directed function calling is going to be a huge unlock
This was my intuition as well, glad it resonates with you :)
> One thing I’m curious about is narrowing the scope of accessible functions based on a state machine that is designed to match the business domain.
This is an interesting question; I can definitely see how a state machine could help narrow the scope of accessible functions.
We're all TypeScript under the hood, so I'll give this a look and see if we can use it.
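To make that concrete, here's roughly the shape I'm imagining (just a sketch with made-up names, not Symphony's actual API): each state of a business-domain state machine exposes only the functions that are legal in that state, and only that subset gets sent to the model with the user's message.

```typescript
// Hypothetical sketch: gate which function definitions are exposed to the
// model based on the current state of a business-domain state machine.

type FunctionDef = {
  name: string;
  description: string;
  parameters: Record<string, unknown>; // JSON Schema for the arguments
};

type OrderState = "browsing" | "checkout" | "fulfilled";

const allFunctions: Record<string, FunctionDef> = {
  searchCatalog: {
    name: "searchCatalog",
    description: "Search the product catalog by keyword",
    parameters: { type: "object", properties: { query: { type: "string" } } },
  },
  submitPayment: {
    name: "submitPayment",
    description: "Charge the customer's saved payment method",
    parameters: { type: "object", properties: { amountCents: { type: "number" } } },
  },
  trackShipment: {
    name: "trackShipment",
    description: "Look up the shipping status of an order",
    parameters: { type: "object", properties: { orderId: { type: "string" } } },
  },
};

// Each state only exposes the functions that make sense in that state.
const functionsByState: Record<OrderState, string[]> = {
  browsing: ["searchCatalog"],
  checkout: ["searchCatalog", "submitPayment"],
  fulfilled: ["trackShipment"],
};

// Only this filtered list would be sent to the model with the user's message.
function functionsFor(state: OrderState): FunctionDef[] {
  return functionsByState[state].map((name) => allFunctions[name]);
}

console.log(functionsFor("checkout").map((f) => f.name)); // ["searchCatalog", "submitPayment"]
```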
Symphony wouldn’t support other LLMs currently, right? Only GPT-4?
Right, currently Symphony only supports GPT-4 and GPT-3.5-turbo, since they're the only models with native function-calling support in the API.
IIRC, only the function signatures (names, descriptions, and parameter schemas) count toward the context window, so you can add as many as you like until you exceed that limit. The function body itself is never sent to the model, so it can be any length.
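To illustrate (an OpenAI-style function-calling request body, with made-up names): the name/description/parameter schema is what gets sent and counted against the context window, while the implementation stays entirely on your side.

```typescript
// Local implementation: can be arbitrarily long, none of it is sent to the model.
async function getWeather(args: { city: string }): Promise<string> {
  // ...imagine hundreds of lines of lookup/caching logic here...
  return `Sunny in ${args.city}`;
}

// What actually goes over the wire (OpenAI-style function-calling format):
// just the name, description, and a JSON Schema for the parameters.
const requestBody = {
  model: "gpt-4-0613",
  messages: [{ role: "user", content: "What's the weather in Lisbon?" }],
  functions: [
    {
      name: "getWeather",
      description: "Get the current weather for a city",
      parameters: {
        type: "object",
        properties: { city: { type: "string" } },
        required: ["city"],
      },
    },
  ],
};
```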
> is there any way to choose only a subset of the functions to share for a given user query?
As of now, no. I can see why this could become a problem soon: right now all functions are sent on every GPT-4 call, and each call can get expensive pretty quickly if you're sending something like 50 function definitions every time.
I'm not sure how to address this yet, but I'd like to think of it as some form of fine-tuning that happens after having a few conversations. Will keep you in the loop!
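In the meantime, one cheap stopgap I've been toying with (purely a sketch, nothing in Symphony does this today) is pre-filtering which function definitions get sent for a given query, e.g. by matching the query against function names and descriptions:

```typescript
// Hypothetical pre-filter: pick a small subset of function definitions per
// query so you don't pay to send all ~50 schemas on every call.

type FunctionDef = { name: string; description: string; parameters: object };

function selectFunctions(query: string, all: FunctionDef[], limit = 5): FunctionDef[] {
  const words = query.toLowerCase().split(/\W+/).filter(Boolean);
  const scored = all.map((fn) => {
    const text = `${fn.name} ${fn.description}`.toLowerCase();
    // Naive relevance: count how many query words appear in the name/description.
    const score = words.filter((w) => text.includes(w)).length;
    return { fn, score };
  });
  return scored
    .sort((a, b) => b.score - a.score)
    .slice(0, limit)
    .map((s) => s.fn);
}

// Usage: send only the top few schemas instead of the full list, e.g.
// const subset = selectFunctions(userMessage, allFunctionDefs, 5);
```

A keyword match is obviously crude; swapping in embedding similarity over the descriptions would be the more robust version of the same idea.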