I don't think Zeta is quite up to Windsurf's completion quality/speed.
I get that this would go against their business model, but maybe people would pay for this - it could in theory be the fastest completion since it would run locally.
This makes using computers even harder to explain to people who do not spend their entire day keeping up with the latest developments. They cannot form a mental image or reuse any memory of what will happen next, because it is all context dependent.
On the other end of the spectrum, for power users, dynamically adapting user interfaces can also be quite annoying. One can't type ahead, or use shortcut keys, because one doesn't know what the context will be. Having to wait any positive amount of time for feedback is limiting.
Then again, there are probably tons of places where this is useful. I'm just a bit disappointed that we (as a society) haven't gotten the basics covered: programming still requires text files that can be sent to a matrix printer, and the latency of most applications is increasing instead of decreasing as computers become faster.
My hunch is that typing 5 words could be faster than going through 3 menus and screens. I don't see this as a replacement for normal UI, but as an optional shortcut - if you're happy typing.
The problem is that it's hard to come up with better examples than the toy examples of weather and maps. Goodness, there are so many travel planning demos. Who actually wants the context switch of a UI popping up mid-typed-sentence? Is a date picker really more convenient than typing "next school break"? Visualizations are interesting -- but that changes the framing from soliciting input to densifying information presentation. Data grids and charts will be valuable.
Anyway, it's a space that's still starving for great ideas. Good luck!
The main differentiator here is as-you-type UI customization. Also, in this case the UI isn't generative in the strict sense: the options are hard-coded, and the LLM just chooses between them.
I think we'll see a mix of all of the above in the future as we take full advantage of LLMs.
I guess half the internet?