Readit News
renonce commented on Kimi K2 is a state-of-the-art mixture-of-experts (MoE) language model   twitter.com/Kimi_Moonshot... · Posted by u/c4pt0r
viraptor · 5 months ago
How well separated are experts per domain in a model like that? Specifically, if I'm interested in a programming use only, could we possibly strip it to one or two of them? Or should I assume a much wider spread? (And there would be some overlap anyway from the original root model)
renonce · 5 months ago
My experience is that experts are not separated in any intuitive way. I would be very interested (and surprised) if someone manages to prune a majority of experts in a way that preserves model capabilities in a specific domain but not others.

See https://github.com/peteryuqin/Kimi-K2-Mini, a project that keeps a small portion of the experts and layers while preserving the model's capabilities across multiple domains.
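
To make that concrete, here's a rough sketch (my own, not from the linked repo; get_router_logits is a hypothetical helper, and the expert count and top-k are placeholder values) of how you could profile which experts a given domain actually activates before trying to prune:

    import torch

    def expert_usage(model, tokenizer, texts, top_k=8, num_experts=384):
        # Accumulate how often the router selects each expert over a corpus.
        counts = torch.zeros(num_experts)
        for text in texts:
            ids = tokenizer(text, return_tensors="pt").input_ids
            # get_router_logits is hypothetical: assume it returns per-layer
            # router logits shaped [num_tokens, num_experts].
            for layer_logits in get_router_logits(model, ids):
                chosen = layer_logits.topk(top_k, dim=-1).indices
                counts += torch.bincount(chosen.flatten(), minlength=num_experts).float()
        return counts / counts.sum()

Comparing the resulting distributions for, say, a code corpus versus a prose corpus would show how much they overlap; in my experience the overlap is large, which is exactly why domain-specific pruning rarely works cleanly.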

renonce commented on Don't Publish with IEEE (2005)   cr.yp.to/writing/ieee.htm... · Posted by u/stargrave
curiousfab · a year ago
Ctrl+i (Firefox)

Not generally useful to show this by default, because nowadays most pages are dynamically generated and although it's technically easy to implement, the last modified header is typically not set to $now.

renonce · a year ago
In Chrome you can press F12, go to the "Network" tab, and refresh the page. Choose the first file in the list (that's the HTML itself) and you will find "Response Headers" in the "Headers" panel, which includes Last-Modified. It's buried a bit deep, which makes sense as it's rarely useful.
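
If you'd rather skip DevTools entirely, a few lines of Python fetch the same header (the URL below is a placeholder; many dynamically generated pages simply won't send anything useful):

    import urllib.request

    # Placeholder URL; substitute the page you care about.
    req = urllib.request.Request("https://example.com/", method="HEAD")
    with urllib.request.urlopen(req) as resp:
        print(resp.headers.get("Last-Modified"))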
renonce commented on AI tool cuts unexpected deaths in hospital by 26%, Canadian study finds   cbc.ca/news/health/ai-hea... · Posted by u/isaacfrond
renonce · a year ago
> That warning showed the patient's white blood cell count was "really, really high," recalled Bell, the clinical nurse educator for the hospital's general medicine program.

I’m not sure how an alarm for “high white cell count” should have had so much impact. Here in China once the doctor prescribes a finger blood test, we sample finger blood after lining up for 15 minutes, and the result is available within 30 minutes. The patient prints the results from a kiosk and any patient who cares enough about their own health will see the exceptionally high white cell count and request an urgent appointment with the doctor for diagnosis right away. Even in normal cases we usually have the doctor see the report within two hours. Why wait several hours?

> While the nursing team usually checked blood work around noon, the technology flagged incoming results several hours beforehand.

> But in health care, he stressed, these tools have immense potential to combat the staff shortages plaguing Canada's health-care system by supplementing traditional bedside care.

This sounds like the deaths prevented by this tech were caused by delays and staff shortages, and what the tech actually does is prioritize patients with serious issues? While I appreciate using new tools to cut deaths, it looks like the elephant in the room is the staff shortage?

renonce commented on Chrome is entrenching third-party cookies that will mislead users   brave.com/blog/related-we... · Posted by u/NayamAmarshe
thayne · a year ago
And that would be annoying to people who aren't already logged in to a related site.

Also, there is no way to know which related site the user is logged in to, so they would have to prompt for every one of their sites.

renonce · a year ago
> Also, there is no way to know which related site the user is logged in to, so they would have to prompt for every one of their sites.

This is not how it works. The mechanism is about allowing a cluster of websites to choose a single first-party domain and have all of them share cookies together, not about sharing arbitrary cookies between arbitrary domains; otherwise it would create loopholes through connected components that bring back the downsides of third-party cookies. What you mentioned should be done using SSO.

After thinking about it a bit more, I have a clearer picture of how it should work in my mind:

* All cookies are double-keyed: the primary key is the origin of the top-level page and the secondary key is the origin of the page that sets the cookie, just like how partitioned cookies work right now.

* stackoverflow.com uses a header, meta tag or script to request changing its primary key domain to “stackexchange.com”

* The browser makes a request to https://stackexchange.com/domains.txt and makes sure that “stackoverflow.com” is in the list, authorising this first-party domain change (see the sketch after this list)

* When the user agrees to the change, the page is reloaded with stackexchange.com as the primary key, so stackoverflow.com can obtain login details from stackexchange.com via CORS or cross-site cookies.

* A side effect is that all cookies and state are lost when switching the first-party domain. Should stackoverflow.com be acquired by a new owner, say x.com, and change its first-party domain to x.com, all cookies on stackoverflow.com are lost and the user will have to log in on x.com again, maybe using credentials from stackexchange.com. It’s unfortunate, but it works around the issues mentioned in the post in a clean way, avoiding loopholes that transfer cookies by switching the first-party domain frequently.
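
To make the domains.txt step concrete, here's a minimal sketch of the check I have in mind (the whole scheme is hypothetical, including domains.txt and its assumed plain newline-separated format):

    import urllib.request

    def authorize_primary_key_change(requesting_site: str, target_domain: str) -> bool:
        # Fetch the allow-list published by the target first-party domain and
        # require that the requesting site is explicitly listed there.
        url = f"https://{target_domain}/domains.txt"
        with urllib.request.urlopen(url) as resp:
            allowed = {line.strip() for line in resp.read().decode().splitlines() if line.strip()}
        return requesting_site in allowed

    # e.g. stackoverflow.com asking to be re-keyed under stackexchange.com:
    # authorize_primary_key_change("stackoverflow.com", "stackexchange.com")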

renonce commented on Chrome is entrenching third-party cookies that will mislead users   brave.com/blog/related-we... · Posted by u/NayamAmarshe
thayne · a year ago
This is a tough situation.

Yes, this can, and will, be abused for tracking users across domains that they don't expect to be related.

But there are also legitimate use cases for this.

For example, consider the stackexchange family of sites. They are clearly related, have a unified branding, etc. but are on separate domains. On Firefox, which blocks third party cookies, I have to log in to each of those domains separately. I can't log in to stackoverflow.com, then go to superuser.com and already be logged in. That is a problem that First party sets would solve.

You can argue that it would be better for those sites to be subdomains of a single unified domain, but when the sites were created there wasn't any compelling reason to need to do that, because third party cookies were still very much alive and kicking. And I can say from experience that migrating an app to a different domain without breaking things for users is a royal pain, and can be very expensive.

I'm not saying that First Party Sets should be accepted as is, but it is attempting to solve real problems. And I think a solution that simultaneously protects users' privacy and maintains a good experience for sites that are legitimately related will be difficult to find, or maybe impossible.

renonce · a year ago
> I can't log in to stackoverflow.com, then go to superuser.com and already be logged in.

I would expect a popup like “This site wants to share cookies with stackexchange.com, press Allow to sign in, press Reject to reject forever or press Ignore to decide later”. Takes a single click to enjoy the benefits of both worlds. The mechanism should make sure that every website has a single “first-party domain” shared across all subsites and that first-party domain must not share cookies with any other site than itself to minimize confusion.

renonce commented on The semantic web is now widely adopted   csvbase.com/blog/13... · Posted by u/todsacerdoti
renonce · a year ago
Looks like a perfect use case for LLMs: generate that JSON-LD metadata from HTML via an LLM, either on the website owner's side or on the crawler's side. If crawlers do it, website owners don’t need to do anything to enter the Semantic Web, and crawlers can specify whatever metadata format they want to extract. This promises an appealing future for Web 3.0, defined not by crypto or hand-written metadata, but by LLMs.
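
As a rough sketch of the crawler-side version (llm_complete stands in for whatever LLM client you'd use and is purely hypothetical; the prompt and the schema.org fields are just examples):

    import json

    def html_to_jsonld(html: str, llm_complete) -> dict:
        # llm_complete is a hypothetical callable (prompt -> str); plug in any LLM client.
        prompt = (
            "Extract schema.org JSON-LD metadata for this page (e.g. @type, headline, "
            "author, datePublished). Return only the JSON object.\n\n" + html
        )
        return json.loads(llm_complete(prompt))

    # The result could be embedded as <script type="application/ld+json">...</script>,
    # or kept crawler-side as structured data.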

u/renonce

Karma: 729 · Cake day: May 18, 2020