Readit News
chewxy commented on Do Large Language Models know who did what to whom?   arxiv.org/abs/2504.16884... · Posted by u/badmonster
kazinator · 4 months ago
Of course they can do it, if they are trained with a large number of pairs of data consisting of various texts, and annotations of who does what in that text. Then they will predict correct tokens that talk about who did what.

LLMs are pretty good at preserving who did what when they translate from one language to another. That's because translation examples they are trained on correctly preserve who did what.

chewxy · 4 months ago
Maybe read the paper first?

> This study asked whether Large Language Models (LLMs) understand sentences in the minimal sense of representing “who did what to whom”. In Experiment 1, we found that the overall geometry of LLM distributed activity patterns failed to capture this information: similarities between sentences reflected whether they shared syntax more than whether they shared thematic role assignments. Human judgments, in contrast, were strongly driven by this aspect of meaning.

> In Experiment 2, we found limited evidence that thematic role information was available even in a subset of hidden units. Whereas activity patterns in subsets of hidden units often allowed for significant classification of whether sentence pairs had shared vs. opposite thematic role assignments, the effect sizes were small; even the best-performing case appeared to lag behind humans, and its representation of thematic roles did not seem robust across syntactic structures.

> However, thematic role information was reliably available in a large number of attention heads, demonstrating LLMs have the capacity to extract thematic role information. In some cases, information present in attention heads descriptively exceeded human performance.
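For a rough sense of what the probing setup in Experiment 2 involves: a linear classifier (a "probe") is trained on activations from a subset of hidden units to predict whether a sentence pair has shared or opposite thematic role assignments. This is a minimal sketch with entirely synthetic activations — the dimensions, data, and probe here are illustrative assumptions, not the paper's actual models or stimuli:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 32    # size of the hypothetical hidden-unit subset
n = 200   # number of sentence pairs

# Synthetic stand-in for hidden activations: a single latent direction
# encodes whether the pair shares thematic role assignments.
role_dir = rng.normal(size=d)
role_dir /= np.linalg.norm(role_dir)

labels = rng.integers(0, 2, size=n)             # 1 = shared roles, 0 = opposite
noise = rng.normal(scale=0.3, size=(n, d))
X = noise + np.outer(2 * labels - 1, role_dir)  # signal at +/-1 along role_dir

# Linear probe: logistic regression fit by plain gradient descent.
w = np.zeros(d)
b = 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))  # predicted P(shared roles)
    w -= 1.0 * (X.T @ (p - labels) / n)
    b -= 1.0 * np.mean(p - labels)

acc = np.mean(((X @ w + b) > 0) == labels)
print(f"probe accuracy: {acc:.2f}")
```

If the probe classifies well above chance, the information is linearly decodable from those units — which is the paper's criterion for the information being "available", separate from whether the model's overall representational geometry reflects it.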

chewxy commented on Are Plants Farming Us?   inleo.io/@gentleshaid/are... · Posted by u/signa11
chewxy · 5 months ago
I wrote a short story ten years ago making fun of this concept: https://blog.chewxy.com/2014/05/20/the-long-term-plan/
chewxy commented on Tree Calculus   treecalcul.us/... · Posted by u/iamwil
chewxy · 9 months ago
Barry Jay's got an upcoming paper at PEPM regarding typed tree calculus. Good read too.
chewxy commented on Ask HN: What are you working on (September 2024)?    · Posted by u/david927
bidder33 · a year ago
good title!
chewxy · a year ago
Thanks :)
chewxy commented on Ask HN: What are you working on (September 2024)?    · Posted by u/david927
wslh · a year ago
What is your GTM strategy?
chewxy · a year ago
At this point I'm writing mostly for myself. GTM strategies for novels... that's an interesting way to think about things. I've not thought about it just yet. Happy to hear if you have any ideas tho.
chewxy commented on Ask HN: What are you working on (September 2024)?    · Posted by u/david927
chewxy · a year ago
I'm working on my scifi novel. I started writing it when LLMs took off - I had been doing AI for two decades and was well-placed to profit from the rise of LLMs, but I ended up gaining nothing much and was depressed about it - so I started writing instead. I'd been picking at it for about a year before befriending an editor who encouraged me to keep writing. He's helped me developmentally edit it to the point where I am now ready to work on my second draft.

It's a hard scifi novel with mild existential horror tones, born mostly of maths jokes. At one point the main character tries to escape the matrix (reality). But the matrix is defective, so the best way out is to orthogonalize the subspace and reduce the matrix to its eigenbasis instead. Most of the scenes are based on similar maths jokes.

Tentative name is Diagonalization of the Meta (I had previously called it The Metaverse).

chewxy commented on IOGraphica   iographica.com/... · Posted by u/bookofjoe
nyrulez · a year ago
Weird lack of examples. I was curious but I am not going to download before I have some idea of what I am getting into.
chewxy commented on APL Demonstration (1975) [video]   youtube.com/watch?v=_DTpQ... · Posted by u/nequo
dbcurtis · a year ago
A friend once described APL as a “write-only language”. You make his case well :)

It is pretty easy to write unmaintainable APL, it seems to me.

chewxy · a year ago
Not really, this is actually pretty readable
chewxy commented on The case for not sanitising fairy tales   plough.com/en/topics/cult... · Posted by u/crapvoter
sltkr · a year ago
It's interesting that the article mentions Hans Christian Andersen's “The Little Mermaid” as an example of a story that was “sanitized” by removing the part where Ariel is forced to choose between killing her prince or turning into foam on the waves.

But Andersen's story was itself a sanitized version of Friedrich de la Motte Fouqué's “Undine”, a fairy/morality tale in which a water spirit marries a human knight in order to gain an immortal soul. In that story, her husband ultimately breaks his wedding vows, forcing Undine to kill him, and losing her chance of going to heaven.

Andersen explicitly wrote that he found that ending too depressing, which is why he made up his whole bit about Ariel refusing to kill Prince Erik, and instead of dying, she turned into a spirit of the air, where if she does good deeds for 300 years, she's eventually allowed to go to heaven after all.

Even as a child, it felt like a cop-out to me. But my point was: “The Little Mermaid” is itself a sanitized version of the original novella, adapted to the author's modern sensibilities.

chewxy · a year ago
I told a variant of the original Little Mermaid story as part of a school outreach program. The kids came to the conclusion that God wasn't a fair being because he didn't give mermaids souls. I walked away satisfied that my little counterprogramming against Catholic school indoctrination might have worked. I wasn't invited back (at least for school year 2024).

u/chewxy

Karma: 5011 · Cake day: November 3, 2010
About
You can contact me here: chewxy [at] gmail dot com

My personal blog is http://blog.chewxy.com. Be warned. Lots of nonsense in there.
