“None of these actions constitutes valid service under the US-UK Mutual Legal Assistance Treaty, United States law or any other proper international legal process.”
https://www.courtlistener.com/docket/71209929/1/4chan-commun...
Just with those two criteria you’re down to, like, six formats at most, of which Protocol Buffers is the most widely used.
And I know the article says no one uses the backwards-compatibility features, but that's bizarre to me: set up N clients and a server that communicate over Protocol Buffers, and you can add fields to the schema and then deploy the servers and clients in any order, which is far nicer than formats that force you to babysit deployment order.
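A minimal sketch of what that looks like on the schema side (the message and field names here are hypothetical, not from the article): add the new field under a previously unused tag number, and peers still running the old schema simply skip it as an unknown field, so rollout order stops mattering.

    // v1 schema, already deployed everywhere (hypothetical message)
    message UserRequest {
      string user_id   = 1;
      int64  timestamp = 2;
    }

    // v2 schema: one new field with a fresh tag number. Old binaries
    // ignore field 3 as an unknown field; new binaries see the default
    // value when an old sender omits it. Never reuse or renumber
    // existing tags, or the compatibility guarantee breaks.
    message UserRequest {
      string user_id   = 1;
      int64  timestamp = 2;
      string locale    = 3;
    }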
The reason protos suck is that remote procedure calls suck, and protos expose that suckage instead of hiding it until you trip on it. I hope the people working on protos, and on the alternatives, keep improving them, but even today they're no worse than not using them.
> LLM training & data storage: This study specifically considers the inference and serving energy consumption of an AI prompt. We leave the measurement of AI model training to future work.
This is disappointing, and no analysis is complete without attempting to account for training, including training runs that were never deployed. I’m worried these numbers would be significantly worse and that’s why we don’t have them.
1. In Kenji's article on how to cut an onion, he shows a picture after doing the horizontal cuts; he did five of them, not just one or two. (https://www.seriouseats.com/knife-skills-how-to-slice-and-di...)
2. I'm pretty sure I do more than 10 vertical cuts; there's no clear image of that in the link above, and the video cuts away before he finishes the vertical cuts, but I think he's doing at least 20.
3. In real life, an onion starts flexing and bending as you cut. With a very sharp knife, I'm sure you do get some of the tiny pieces that throw off the standard deviation for the "more cuts" method, but many of those small pieces never actually get cut, because a layer of the onion gets pushed out of the way instead of having a tiny piece sliced off.
The reason we have dependency ranges and lockfiles is so that library a1.0 can declare "I need c >= 2.1" and b1.0 can declare "I need c >= 2.3", and when you depend on both a1.0 and b1.0, dependency resolution can run and lock in c2.3 as the version the binary uses.
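As a rough sketch of that resolution step (package names and version numbers are made up, and real resolvers differ: some pick the newest satisfying version, Go-style minimal version selection picks the lowest):

    # Hypothetical example: a 1.0 needs c >= 2.1, b 1.0 needs c >= 2.3.
    # The resolver intersects the constraints; the lockfile records the result.

    def resolve(minimums, available):
        """Return the lowest available version satisfying every >= constraint."""
        floor = max(minimums)                         # the strictest minimum wins
        candidates = [v for v in available if v >= floor]
        if not candidates:
            raise RuntimeError("no version of c satisfies all constraints")
        return min(candidates)

    minimums = [(2, 1), (2, 3)]                       # from a 1.0 and b 1.0 respectively
    available = [(2, 0), (2, 1), (2, 3), (3, 0)]      # versions of c published upstream
    print("lock c at", resolve(minimums, available))  # -> (2, 3)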
Many programming languages have a variant or object type. In C#, any instance of a class will also report that it is of type System.Object, which nearly makes System.Object a type of all types.
There are some nuances and special cases, though. For example, a null is considered a null instance of any nullable type, but you're not permitted to ask a null value what type it is; it just is null. Similarly, C# differentiates between a class and an instance of a class: both a class and an instance are of a given type, but a class is not an instance of a class.
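A small sketch of those nuances in C# (this is just observable language behaviour, nothing from the paper):

    using System;

    class TypeOfAllTypes
    {
        static void Main()
        {
            object s = "hello";
            Console.WriteLine(s.GetType());     // System.String
            Console.WriteLine(s is object);     // True: every instance is a System.Object

            object n = null;
            Console.WriteLine(n is object);     // False: null is not an instance of anything
            // n.GetType();                     // throws NullReferenceException: you cannot
                                                // ask a null value what type it is

            // A class and an instance are both "of" a type, but the class itself is
            // represented by a Type object, which is an instance of a different class.
            Console.WriteLine(typeof(string).GetType()); // System.RuntimeType
        }
    }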
Presumably the difference is either in one of those nuances, or else in some other axiomatic assertion in the language design that this paper is not making.
Or else I'm very much missing what the author is driving at, which at this time of the morning seems equally possible.