Looks like a big pivot in target audience from developers to regular users, at least judging by the homepage at https://ollama.com/ and how the product is presented. Before, it was all about the CLI version of Ollama for devs; now that's not even mentioned. At the bottom of the blog post it says:
> For pure CLI versions of Ollama, standalone downloads are available on Ollama’s GitHub releases page.
Nothing against that, just an observation.
Previously I tested several local LLM apps, and the two best ones for me were LM Studio [1] and Msty [2]. Will check this one out for sure.
One feature the ChatGPT desktop app has that I think would be a good fit for these local LLM apps is a global shortcut (Alt + Space) to open a new chat at any time in a reduced UI. It's great for quick questions.
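For what it's worth, here's a minimal sketch of how such a quick-chat shortcut could be wired up, assuming the app is Electron-based (none of these apps necessarily are); the window size and `quick-chat.html` page are purely illustrative:

```ts
// Sketch only: a global Alt+Space shortcut that toggles a small,
// frameless "quick chat" window with a reduced UI.
import { app, BrowserWindow, globalShortcut } from 'electron';

let quickChat: BrowserWindow | null = null;

function toggleQuickChat(): void {
  if (quickChat && !quickChat.isDestroyed()) {
    // Reuse the existing window: hide it if visible, show it otherwise.
    quickChat.isVisible() ? quickChat.hide() : quickChat.show();
    return;
  }
  quickChat = new BrowserWindow({
    width: 600,
    height: 140,
    frame: false,
    alwaysOnTop: true,
  });
  quickChat.loadFile('quick-chat.html'); // hypothetical reduced-UI page
}

app.whenReady().then(() => {
  // Register the system-wide shortcut once the app is ready.
  globalShortcut.register('Alt+Space', toggleQuickChat);
});

// Clean up the shortcut registration on exit.
app.on('will-quit', () => globalShortcut.unregisterAll());
```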
That feature is available in HugstonOne with a new tab, among other features :)
Edit: It's incredible how unethical all the other developers are with their unrelated, crappy spam. Ollama is a great app and a pioneer of AI; kudos and my best thanks.
I tried Ollama once but removed it immediately when I couldn't easily install models outside of the ones they "support". LM Studio is by far the best tool out there, in my humble opinion.
[1] https://lmstudio.ai/
[2] https://msty.app/