Readit News
xzyaoi commented on Groq runs Mixtral 8x7B-32k with 500 T/s   groq.com/... · Posted by u/tin7in
londons_explore · 2 years ago
> more than a single model and a lot of finetunes/high rank LoRAs

I can imagine a way might be found to host a base model and a bunch of LoRAs whilst using barely more RAM than the base model alone.

The fine-tuning could perhaps be done in such a way that only around 0.1% of the weights are changed, and for every computation the difference is applied not to the weights but to the output-layer activations.
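
A minimal sketch of that idea, assuming one shared base weight matrix and per-tenant low-rank adapters (all names, shapes, and values below are illustrative, not taken from any specific system):

```python
import numpy as np

# Shared base weight: loaded once, reused by every fine-tune.
d_in, d_out, rank = 4096, 4096, 8
W_base = np.random.randn(d_out, d_in).astype(np.float32)

# Each fine-tune stores only two small low-rank factors (A, B),
# costing roughly 2 * d * rank floats instead of d * d.
adapters = {
    "tenant_a": (0.01 * np.random.randn(rank, d_in).astype(np.float32),
                 0.01 * np.random.randn(d_out, rank).astype(np.float32)),
    "tenant_b": (0.01 * np.random.randn(rank, d_in).astype(np.float32),
                 0.01 * np.random.randn(d_out, rank).astype(np.float32)),
}

def forward(x, tenant):
    # Apply the shared base weights, then add the tenant's low-rank
    # correction on the activations instead of materializing W_base + B @ A.
    A, B = adapters[tenant]
    return W_base @ x + B @ (A @ x)

x = np.random.randn(d_in).astype(np.float32)
y_a = forward(x, "tenant_a")  # output for one fine-tune
y_b = forward(x, "tenant_b")  # output for another; same base weights in RAM
```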

xzyaoi · 2 years ago
There are also papers on serving full-parameter fine-tuned models: https://arxiv.org/abs/2312.05215

Disclaimer: I'm one of the authors.

xzyaoi commented on Show HN: Aid – System for quickly installing and running machine learning models   youtube.com/watch?v=18ulW... · Posted by u/xzyaoi
sharemywin · 5 years ago
what's the url to the actual website?
xzyaoi · 5 years ago
https://github.com/autoai-org/aid

I chose the YouTube link because I felt it describes the project better, hopefully :(

xzyaoi commented on RSSHub: Everything Is RSSible   github.com/DIYgod/RSSHub... · Posted by u/hawkoy
pembrook · 6 years ago
Is there a list of available routes somewhere?

I’m trying to create a personal clone of mailbrew to send myself a newsletter of all my favorite sources.

xzyaoi · 6 years ago
It's here: https://docs.rsshub.app/en/

By the way, I've been using this for a while and it is superb.
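
For the mailbrew-style digest, a minimal sketch, assuming the feedparser package and a reachable RSSHub instance; the route below is only an illustrative example, the full route list is in those docs:

```python
import feedparser

# Each entry is an RSSHub route URL (see https://docs.rsshub.app/en/ for routes).
feeds = [
    "https://rsshub.app/github/trending/daily/python",  # example route
]

items = []
for url in feeds:
    parsed = feedparser.parse(url)
    for entry in parsed.entries[:5]:
        items.append(f"- {entry.title}\n  {entry.link}")

# Plain-text digest; actually mailing it is left to smtplib or any mail API.
print("\n".join(items))
```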

xzyaoi commented on HashiCorp forbids its software being used in China   hashicorp.com/terms-of-ev... · Posted by u/xzyaoi
stunt · 6 years ago
Weird that there is no official announcement from HashiCorp to explain this decision.
xzyaoi · 6 years ago
It would be great if they could explain this decision in more detail, but I guess they are still working on explaining it better and more clearly.

u/xzyaoi

Karma: 49 · Cake day: October 1, 2018