what's the url to the actual website?
https://github.com/autoai-org/aid
I chose the YouTube link as I felt it describes it better, hopefully :(
I’m trying to create a personal clone of mailbrew to send myself a newsletter of all my favorite sources.
By the way, I've been using this for a while and it is superb.
I can imagine a way might be found to host a base model and a bunch of LoRAs while using barely more RAM than the base model alone.
The fine-tuning could perhaps be done so that only something like 0.1% of the weights are changed, and at inference time the difference is applied not to the weights themselves but to each layer's output activations.
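A minimal NumPy sketch of that idea (all names, shapes, and the rank are illustrative assumptions, not from any particular implementation): each adapter stores only two small low-rank matrices, and its contribution is added to the layer's output activations, so the full base weights are kept in memory exactly once.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, rank = 512, 512, 8  # illustrative sizes

# One shared base weight matrix, loaded once.
W = rng.standard_normal((d_out, d_in)) * 0.02

# Several LoRA-style adapters: each is just (A, B), so the extra memory
# per adapter is rank * (d_in + d_out) values instead of a full copy
# of W's d_out * d_in values.
adapters = {
    name: (
        rng.standard_normal((rank, d_in)) * 0.01,   # A: projects down
        rng.standard_normal((d_out, rank)) * 0.01,  # B: projects back up
    )
    for name in ("user-a", "user-b")
}

def forward(x, adapter=None):
    # Base computation, shared by every adapter.
    y = W @ x
    if adapter is not None:
        A, B = adapters[adapter]
        # The fine-tune's delta is added to the output activations;
        # the merged matrix W + B @ A is never materialized.
        y = y + B @ (A @ x)
    return y

x = rng.standard_normal(d_in)
base = forward(x)
tuned = forward(x, "user-a")
```

Here each adapter costs 8 × (512 + 512) = 8,192 values against 262,144 for the base matrix, about 3%; with a lower rank or fewer adapted layers the overhead shrinks toward the fraction mentioned above.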
Disclaimer: I'm one of the authors.