Readit News
Posted by
u/annjose
9 months ago
How to run LLMs locally on mobile devices (with Gemma and On-Device AI tools)
annjose.com/post/mobile-o...
incomingpain
·
9 months ago
Any model that can run on a mobile device will likely be 8B parameters or smaller, and will have very noticeable hallucination problems.
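A quick back-of-the-envelope calculation shows why mobile deployments top out around 8B parameters: even heavily quantized, the weights alone eat a large share of a phone's RAM. This sketch only estimates weight storage (it ignores KV cache and activation memory, which add more):

```python
# Rough memory estimate for LLM weights at common quantization
# levels, illustrating why on-device models tend to stay at
# ~8B parameters or below.

def weight_memory_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate RAM for the weights only (no KV cache or activations)."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9  # decimal gigabytes

for bits in (16, 8, 4):
    print(f"8B model @ {bits}-bit: ~{weight_memory_gb(8, bits):.0f} GB")
# Even at 4-bit quantization, an 8B model needs ~4 GB for weights,
# already a big slice of a typical phone's 8-12 GB of RAM.
```

By the same arithmetic, a 2B or 3B model at 4-bit fits in 1.0-1.5 GB, which is why the smallest Gemma variants are the usual choice for on-device use.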