QueensGambit commented on Hypernetworks: Neural Networks for Hierarchical Data (blog.sturdystatistics.com...) · Posted by u/mkmccjr
QueensGambit · 5 days ago
Factorization is key here. It separates dataset-level structure from observation-level computation, so the model doesn't waste capacity rediscovering that structure for every observation.
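
A minimal sketch of that factorization (hypothetical PyTorch code, not from the linked post): a hypernetwork maps a dataset-level context vector to the weights of a small observation-level network, so the structure is generated once and each observation is just run through the generated weights.

    import torch
    import torch.nn as nn

    class HyperNet(nn.Module):
        """Hypothetical sketch: a dataset-level embedding generates the
        weights of a small observation-level MLP."""
        def __init__(self, ctx_dim=8, in_dim=4, hidden=16, out_dim=2):
            super().__init__()
            self.in_dim, self.hidden, self.out_dim = in_dim, hidden, out_dim
            n_params = (in_dim * hidden + hidden) + (hidden * out_dim + out_dim)
            # dataset-level structure lives here, learned once per dataset
            self.gen = nn.Sequential(nn.Linear(ctx_dim, 64), nn.ReLU(),
                                     nn.Linear(64, n_params))

        def forward(self, ctx, x):
            p = self.gen(ctx)  # generate all observation-level parameters
            i = self.in_dim * self.hidden
            W1 = p[:i].view(self.hidden, self.in_dim)
            b1 = p[i:i + self.hidden]
            j = i + self.hidden
            W2 = p[j:j + self.hidden * self.out_dim].view(self.out_dim, self.hidden)
            b2 = p[j + self.hidden * self.out_dim:]
            # observation-level computation: a plain MLP with generated weights
            h = torch.relu(x @ W1.T + b1)
            return h @ W2.T + b2

    net = HyperNet()
    ctx = torch.randn(8)      # dataset-level embedding
    x = torch.randn(5, 4)     # five observations
    print(net(ctx, x).shape)  # torch.Size([5, 2])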

I've been arguing the same for code generation. LLMs flatten parse trees into token sequences, then burn compute reconstructing the hierarchy as hidden states. Graph transformers could be a good solution to both problems: https://manidoraisamy.com/ai-mother-tongue.html
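
To make the flattening concrete (Python stdlib, illustrative only): the parser's view of even a trivial expression is an explicit tree, while the token stream a language model consumes is flat and the nesting has to be re-inferred.

    import ast, io, tokenize

    src = "f(g(x) + 1)"

    # the parser's view: explicit hierarchy
    print(ast.dump(ast.parse(src, mode="eval")))

    # the token-sequence view: nesting reduced to paired parens
    toks = tokenize.generate_tokens(io.StringIO(src).readline)
    print([t.string for t in toks if t.string.strip()])
    # ['f', '(', 'g', '(', 'x', ')', '+', '1', ')']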
