lalaithion commented on Typechecking is undecidable when 'type' is a type (1989) [pdf]   dspace.mit.edu/bitstream/... · Posted by u/zem
da_chicken · 10 days ago
This seems strange to me, but it's hours past my bedtime and I haven't tried reading lambda calculus or lambda-star calculus theory in about 20 years.

Many programming languages have a variant or object type. In C#, any instance of a class will also say that it is of type System.Object. That nearly makes it a type of all types.

There are some nuances and special cases. For example, any null is considered a null instance of any nullable type, but you're not permitted to ask a null value what type it is; it just is null. Similarly, C# does differentiate between a class and an instance of a class: both a class and an instance are of a given type, but a class is not an instance of a class.

Presumably the difference is either in one of those nuances, or else in some other axiomatic assertion in the language design that this paper is not making.

Or else I'm very much missing what the author is driving at, which at this time of the morning seems equally possible.

lalaithion · 10 days ago
System.Object is a type of all objects, NOT a type of all types. That is, all objects are System.Object: if one has a class Foo, then every object of type Foo is also of type System.Object. But what is the type of Foo itself? Not the type of an instance of Foo, but of Foo itself. In the languages this paper is about, the type system can recurse upon itself, allowing types themselves to have types. It’s sort of like C#’s System.Type, except System.Type is a class that holds type information for runtime type inspection, whereas the type of types in this paper is for compile-time generics that can abstract over not just types but the types of types as well.
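There's a handy runtime analogue of this in Python (just an illustration; the paper is about compile-time type systems, not Python's object model): the metaclass `type` is an instance of itself, which is roughly the "type : type" axiom the paper shows makes typechecking undecidable.

```python
# Runtime analogue of "type : type" (illustration only).
class Foo:
    pass

foo = Foo()
print(type(foo))               # <class '__main__.Foo'> -- the type of an instance
print(type(Foo))               # <class 'type'>         -- the type of a type
print(type(type))              # <class 'type'>         -- `type` is its own type
print(isinstance(type, type))  # True
```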
lalaithion commented on If you're going to vibe code, why not do it in C?   stephenramsay.net/posts/v... · Posted by u/sramsay
lalaithion · 2 months ago
This post mixes up “easy for compilers and assemblers to transform and easy for cpus to execute” with “easy for LLMs to understand” and assumes that anything in the first category must also be in the second category since they’re both computers. In reality, the tools that help humans think are also useful for LLMs.
lalaithion commented on The New AI Consciousness Paper   astralcodexten.com/p/the-... · Posted by u/rbanffy
leumon · 3 months ago
Is there a reason why this text uses "-" instead of em-dashes ("—")?
lalaithion · 3 months ago
Yes; because - is on the keyboard and — isn't. (Don't tell me how to type —; I know how, but that is still the reason, which is what the parent comment asks about.)

lalaithion commented on 4Chan Lawyer publishes Ofcom correspondence   alecmuffett.com/article/1... · Posted by u/alecmuffett
james_in_the_uk · 4 months ago
What “well established process” would apply here?
lalaithion · 4 months ago
The US-UK Mutual Legal Assistance Treaty imposes obligations on Ofcom which they have not met, 4chan claims:

“None of these actions constitutes valid service under the US-UK Mutual Legal Assistance Treaty, United States law or any other proper international legal process.”

https://www.courtlistener.com/docket/71209929/1/4chan-commun...

lalaithion commented on Protobuffers Are Wrong (2018)   reasonablypolymorphic.com... · Posted by u/b-man
lalaithion · 5 months ago
Protocol buffers suck but so does everything else. Name another serialization declaration format that both (a) defines which changes can be made backwards-compatibly, and (b) has a linter that enforces backwards-compatible changes.

Just with those two criteria you’re down to, like, six formats at most, of which Protocol Buffers is the most widely used.

And I know the article says no one uses the backwards-compatible stuff, but that’s bizarre to me: setting up N clients and a server that use protocol buffers to communicate, and then being able to add fields to the schema and deploy the servers and clients in any order, is way nicer than working with formats that force you to babysit deployment order.

The reason protos suck is that remote procedure calls suck, and protos expose that suckage instead of trying to hide it until you trip on it. I hope the people working on protos, and on the alternatives, continue to improve them, but they’re not worse than not using them today.
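A toy sketch of why tag-numbered fields make "add a field, then deploy in any order" work (an illustration of the idea only, not the real protobuf wire format):

```python
# Toy model of tag-numbered serialization (not the actual protobuf
# wire format): decoders skip tags they don't recognize, so adding
# a field is backwards-compatible in both directions.

def encode(fields: dict[int, int]) -> list[tuple[int, int]]:
    """'Serialize' a message as (tag, value) pairs."""
    return sorted(fields.items())

def decode(wire: list[tuple[int, int]], known_tags: set[int]) -> dict[int, int]:
    """Decode only the tags this schema version knows; skip the rest."""
    return {tag: value for tag, value in wire if tag in known_tags}

V1_TAGS = {1, 2}     # old schema
V2_TAGS = {1, 2, 3}  # new schema: field 3 was added

# New server -> old client: the unknown field is skipped, nothing breaks.
print(decode(encode({1: 10, 2: 20, 3: 30}), V1_TAGS))  # {1: 10, 2: 20}

# Old client -> new server: field 3 is simply absent, so it falls back
# to its default value.
print(decode(encode({1: 10, 2: 20}), V2_TAGS))         # {1: 10, 2: 20}
```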

lalaithion commented on Measuring the environmental impact of AI inference   arstechnica.com/ai/2025/0... · Posted by u/ksec
lalaithion · 6 months ago
They didn’t account for training. From the paper:

> LLM training & data storage: This study specifically considers the inference and serving energy consumption of an AI prompt. We leave the measurement of AI model training to future work.

This is disappointing, and no analysis is complete without attempting to account for training, including training runs that were never deployed. I’m worried these numbers would be significantly worse and that’s why we don’t have them.

lalaithion commented on Dicing an Onion, the Mathematically Optimal Way   pudding.cool/2025/08/onio... · Posted by u/kylecarbs
lalaithion · 6 months ago
A couple nits, since we're already going above and beyond the normal amount of nitpicking:

1. In Kenji's article on how to cut an onion, he shows a picture after doing the horizontal cuts; he did five of them, not just one or two. (https://www.seriouseats.com/knife-skills-how-to-slice-and-di...)

2. I'm pretty sure I do more than 10 vertical cuts; there's no easy image in the link above, and the video cuts away before he finishes the vertical cuts, but I think he's doing at least 20.

3. In real life, an onion starts flexing and bending when you cut it. With a very sharp knife, I'm sure you do get a bunch of the small pieces that throw off the standard deviation for the "more cuts" method, but many of those small pieces won't actually be cut, because a layer of the onion gets pushed out of the way instead of having a tiny piece sliced off.

lalaithion commented on We shouldn't have needed lockfiles   tonsky.me/blog/lockfiles/... · Posted by u/tobr
lalaithion · 6 months ago
What if your program depends on library a1.0 and library b1.0, and library a1.0 depends on c2.1 and library b1.0 depends on c2.3? Which one do you install in your executable? Choosing one randomly might break the other library. Installing both _might_ work, unless you need to pass a struct defined in library c from a1.0 to b1.0, in which case a1.0 and b1.0 may expect different memory layouts (even if the public interface for the struct is the exact same between versions).

The reason we have dependency ranges and lockfiles is so that library a1.0 can declare "I need >2.1" and b1.0 can declare "I need >2.3" and when you depend on a1.0 and b1.0, we can do dependency resolution and lock in c2.3 as the dependency for the binary.
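A minimal sketch of that resolution step (hypothetical version numbers, and reading the ">" constraints as "at least"; real resolvers handle full ranges, conflicts, and transitive graphs):

```python
# Each direct dependency declares a minimum version for the shared
# library c; the resolver picks one version satisfying every
# constraint and writes it to the lockfile.

AVAILABLE_C = [(2, 0), (2, 1), (2, 2), (2, 3)]  # published versions of c

MIN_REQUIRED = {
    "a1.0": (2, 1),  # a1.0 declares "I need c >= 2.1"
    "b1.0": (2, 3),  # b1.0 declares "I need c >= 2.3"
}

def resolve(available, mins):
    candidates = [v for v in available if all(v >= m for m in mins.values())]
    if not candidates:
        raise RuntimeError("no version of c satisfies all constraints")
    return max(candidates)  # lock the newest version that satisfies everyone

print(resolve(AVAILABLE_C, MIN_REQUIRED))  # (2, 3) -- recorded in the lockfile
```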

u/lalaithion

Karma: 2412 · Cake day: March 9, 2016

About: “to teach if we are called upon, to be taught if we are fortunate”