Readit News
kanestreet commented on Jan – Ollama alternative with local UI   github.com/menloresearch/... · Posted by u/maxloh
simonw · 6 months ago
Huh, yeah it looks like the GUI component is closed source. Their GitHub version only has the CLI.
kanestreet · 6 months ago
Yup. They have not even acknowledged that it's closed, despite a ton of questions. People are downloading it assuming it's open source only to get a nasty surprise. There's no mention of it in their blog post announcing the GUI, no new license for it, and no privacy policy. Feels deceptive.
kanestreet commented on Ollama's new app   ollama.com/blog/new-app... · Posted by u/BUFU
prophesi · 6 months ago
I think having a bash script as the Linux installation is more of a stop-gap measure than truly supporting Linux. And Ollama is FOSS, unlike LM Studio and Msty. (As someone who switched from Ollama to LM Studio, I'm very happy to see frontend development on Ollama and an easier way to increase a model's context length.)
kanestreet · 6 months ago
lol the Ollama app is closed source
kanestreet commented on Ollama's new app   ollama.com/blog/new-app... · Posted by u/BUFU
underlines · 6 months ago
Heads up, there’s a fair bit of pushback (justified or not) on r/LocalLLaMA about Ollama’s tactics:

    Vendor lock-in: AFAIK it now uses a proprietary llama.cpp fork and built its own registry on ollama.com in a Docker-like way (I've heard Docker people are actually behind Ollama), and it's difficult to reuse model binaries with other inference engines due to their use of hashed filenames on disk, etc.

    Closed-source tweaks: Many llama.cpp improvements haven't been upstreamed or credited, raising licensing concerns. They have since switched to their own inference backend.

    Mixed performance: The same models often run slower or give worse outputs than plain llama.cpp. A tradeoff for convenience, I know.

    Opaque model naming: Rebrands or filters community models without transparency. The biggest failure was labeling the smaller DeepSeek-R1 distills simply "DeepSeek-R1", adding to massive confusion on social media and among "AI content creators" that you can run "THE" DeepSeek-R1 on any potato.

    Difficult to change context window default: Using Ollama as a backend, it is difficult to change the default context window size on the fly, leading to hallucinations and looping output, especially for agents and thinking models.
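For context on that last point: Ollama's HTTP API does let a caller override the context window per request via the `options` field (`num_ctx`), though many frontends don't expose it. A minimal sketch that only builds the request body; the model name is illustrative, and no server is contacted here:

```python
import json

# Sketch: override Ollama's default context window for a single request by
# setting options.num_ctx in the /api/generate request body. POST the result
# to http://localhost:11434/api/generate on a running Ollama server.
payload = {
    "model": "llama3",               # illustrative model name
    "prompt": "Why is the sky blue?",
    "stream": False,
    "options": {"num_ctx": 8192},    # context window for this call only
}
body = json.dumps(payload)
print(body)
```

The same parameter can be baked into a custom model with `PARAMETER num_ctx` in a Modelfile, but the per-request override is what most backends fail to surface.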
---

If you want better (and in some cases more open) alternatives:

    llama.cpp: Battle-tested C++ engine with minimal dependencies and many performance optimizations

    ik_llama.cpp: High-performance fork, even faster than stock llama.cpp

    llama-swap: YAML-driven model swapping for your endpoint.

    LM Studio: GUI for any GGUF model, no proprietary formats, with llama.cpp's optimizations available from the UI

    Open WebUI: Front-end that plugs into llama.cpp, ollama, MPT, etc.
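As a concrete illustration of the llama.cpp route above, this is roughly what serving a local model directly looks like; the model path and port are placeholders:

```shell
# llama.cpp's bundled server: serve a local GGUF with an explicit
# context size (-c) on an OpenAI-compatible HTTP endpoint.
llama-server -m ./models/my-model.gguf -c 8192 --port 8080
```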

kanestreet · 6 months ago
“I heard docker people are behind Ollama”: um, yes, it was founded by ex-Docker people and has raised multiple rounds of VC funding. The writing is on the wall: this is not some virtuous community project, it’s a profit-driven startup, and at the end of the day that is what they are optimizing for.
kanestreet commented on Ollama's new app   ollama.com/blog/new-app... · Posted by u/BUFU
Eisenstein · 6 months ago
Sure, those are all difficult problems. Problems that single devs are dealing with every day and figuring out. Why is it so hard for Ollama?

What seems to be true is that Ollama wants to drive the narrative and to choose for its users rather than with them. It uses a proprietary model library, it built itself on llama.cpp and didn't upstream its changes, it converted standard GGUF model weights into a file layout usable only by itself, etc.

Sorry, but I don't buy it. These are not intractable problems. These are excuses from former Docker creators looking to destroy another ecosystem by attempting to co-opt it for their own gain.

kanestreet · 6 months ago
^^^ Absolutely spot on. There’s a big element of deception going on. I could respect it (and would trust the product more) if they were upfront about their motives and said “yes, we are a venture-backed startup and we have profit aspirations, but here’s XYZ thing we can promise.” Instead it’s all smoke and mirrors … super sus.
kanestreet commented on Ollama's new app   ollama.com/blog/new-app... · Posted by u/BUFU
mchiang · 6 months ago
Ben, we've had private conversations about this previously. I don't see any VC money grab nor am I aware of any.

Building a product that we've dreamed of building is not wrong. Making money does not need to be evil. I, and the folks who worked tirelessly to make Ollama better, will continue to build our dreams.

kanestreet · 6 months ago
I mean, you’re a YC-backed startup, so it’s not like it’s out of the question lol

u/kanestreet

Karma: 6 · Cake day: July 31, 2025