I’m sure there are exploits that could be embedded in a model that make running locally risky as well, but giving remote access to Anthropic, OpenAI, etc. just seems foolish.
Anyone having success with local LLMs and browser use?
I was recently sent a link to this recording of a David Bowie & Nine Inch Nails concert, and I got a seriously uneasy feeling, as if I were on a psychedelic and couldn't quite trust my perception, especially around the 2:00 mark: https://www.youtube.com/watch?v=7Yyx31HPgfs&list=RD7Yyx31HPg...
It turned out that the video was "AI-upscaled" from an original that is really blurry and sometimes has a low frame rate. Those were artistic choices, and I think the original, despite its low resolution, captures the intended atmosphere much better: https://www.youtube.com/watch?v=1X6KF1IkkIc&list=RD1X6KF1Ikk...
We have pretty good cameras and lenses now. We don't need AI to "improve" the quality.
> Please be measured and critical in your response. I appreciate the enthusiasm, but I highly doubt everything I say is “brilliant” or “astute”, etc.! I prefer objectivity to sycophancy.
Copilot isn't locked to a specific LLM, though. You can select the model from a panel, but I don't think you can plug in your own right now, and because of that the ones you can select might not be SOTA.