u/axiosgunnar
Karma: 1722 · Cake day: November 24, 2019
Developer from Riga with a focus on embedded development. Loving husband and father of three children. He/him.
Ollama silently (!!!) drops messages when the context window is exceeded (instead of, you know, just raising an error — who in the world made this decision?).
The workaround until now was to (not use Ollama, or) make sure to only ever send a single message. But now they seem to silently truncate single messages as well, instead of erroring! (This explains the sibling comment where a user could not reproduce the results locally.)
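One way to defend against this is a client-side guard that errors loudly before the request ever reaches the server. A minimal sketch — this is not part of Ollama's API, and the four-characters-per-token ratio is only a rough heuristic, not a real tokenizer:

```python
# Hypothetical pre-flight check: estimate the token count of a prompt
# and raise instead of letting the server silently truncate it.
def check_context(prompt: str, num_ctx: int = 4096, chars_per_token: int = 4) -> str:
    # Rough heuristic: ~4 characters per token for English text.
    est_tokens = len(prompt) // chars_per_token + 1
    if est_tokens > num_ctx:
        raise ValueError(
            f"prompt is ~{est_tokens} tokens, exceeds context window of {num_ctx}"
        )
    return prompt

# Usage: wrap every prompt before sending it to the model server.
safe_prompt = check_context("Summarize this document.", num_ctx=4096)
```

It is crude, but "crash early" beats "silently get a wrong answer".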
Use LM Studio, llama.cpp, OpenRouter, or anything else, but stay away from Ollama!