Readit News
williamtrask commented on Dispelling misconceptions about RLHF   aerial-toothpaste-34a.not... · Posted by u/fpgaminer
williamtrask · 7 days ago
Nit: the author says that supervised fine-tuning is a type of RL, but it is not. RL is about delayed reward; supervised fine-tuning involves no delayed reward at all.
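The distinction can be made concrete with a toy sketch (illustrative only, not any library's actual API; both function names are made up here). In supervised fine-tuning every token gets an immediate per-step supervision signal, while in a REINFORCE-style RL setup a single scalar reward arrives only after the whole sequence is generated:

```python
def sft_loss(token_log_probs):
    # Supervised fine-tuning: each token has a known target, so the
    # signal (negative log-likelihood) is immediate at every step.
    return -sum(token_log_probs) / len(token_log_probs)

def reinforce_loss(token_log_probs, episode_reward):
    # RL (REINFORCE-style): one scalar reward arrives only at the end
    # of the episode and is credited back to every action along the
    # way -- the "delayed reward" part of the distinction.
    return -episode_reward * sum(token_log_probs)

logps = [-0.1, -0.3, -0.2]  # log-probs of the generated tokens

print(sft_loss(logps))             # defined with no reward at all
print(reinforce_loss(logps, 1.0))  # defined only via the end-of-episode reward
```

The point of contrast: `sft_loss` never sees a reward, and `reinforce_loss` has no learning signal until the episode finishes.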
williamtrask commented on European Capitals Are Heating Up   public.tableau.com/app/pr... · Posted by u/gmays
williamtrask · 8 days ago
hugged to death?
williamtrask commented on Claude Sonnet 4 now supports 1M tokens of context   anthropic.com/news/1m-con... · Posted by u/adocomplete
williamtrask · 12 days ago
Claude is down.

EDIT: for the moment... it supports 0 tokens of context xD

williamtrask commented on GPT-5   openai.com/gpt-5/... · Posted by u/rd
highfrequency · 17 days ago
It is frequently suggested that once one of the AI companies reaches an AGI threshold, it will take off ahead of the rest. It's interesting to note that at least so far, the trend has been the opposite: as time goes on and the models get better, the different companies' performance clusters closer together. Right now GPT-5, Claude Opus, Grok 4, and Gemini 2.5 Pro all seem quite good across the board (i.e., they can all basically solve moderately challenging math and coding problems).

As a user, it feels like the race has never been as close as it is now. Perhaps dumb to extrapolate, but it makes me lean more skeptical about the hard take-off / winner-take-all mental model that has been pushed.

Would be curious to hear the take of a researcher at one of these firms - do you expect the AI offerings across competitors to become more competitive and clustered over the next few years, or less so?

williamtrask · 17 days ago
Breakthroughs usually require a step-function change in data or compute, and all the firms have comparable amounts of both. The next big jump in data is probably private data (via de-siloing, robotics, or both). The next big jump in compute is probably either analog computing or quantum. Until then... here we are.
williamtrask commented on Attention is your scarcest resource (2020)   benkuhn.net/attention/... · Posted by u/jxmorris12
28304283409234 · 24 days ago
Spoken like a true <insert derogatory term>.

I disagree. My paycheck stays what it is when I do not go shopping. Time keeps running out regardless of where my attention goes.

To me it seems that attention most certainly is a separate resource.

And more importantly, regardless of what it is, it is more important. Time is meaningless. My attention is not.

williamtrask · 24 days ago
Well... unless you know how to turn your brain completely off while awake, you're probably always giving your attention to something. Consequently, attention and time are spent at very similar rates.
williamtrask commented on Microsoft CEO says up to 30% of the company's code was written by AI   techcrunch.com/2025/04/29... · Posted by u/pseudolus
williamtrask · 4 months ago
If this isn’t jumping the shark, it’s darn close.

u/williamtrask

Karma: 2170 · Cake day: July 6, 2015