Edit: I wonder if the vim community can contribute to a feature bounty like this? Hmm
* interpret generously
Yudkowsky lacks credentials, and MIRI and its adjacent organizations have proven to be incestuous players in the rationalist cottage industry, one that has a serious problem with sexual abuse and literal cults.
It's interesting how often Fermi estimation problems are used as proxies for "intelligence". Something like: 'let's assess how well "they can think" — how many golf balls fit in a baseball stadium?' etc.
Often, doing well on these kinds of problems can more than make up for a lack of specific knowledge in whatever someone is interested in assessing!
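The golf-balls question above boils down to a few rough multiplications. A minimal sketch of how such an estimate is typically worked, where every number (stadium footprint, height, packing fraction) is a deliberately rough assumption, not a measured value:

```python
import math

# Fermi estimate: golf balls in a baseball stadium.
# All inputs are order-of-magnitude guesses, which is the point of the exercise.
stadium_volume_m3 = 200 * 200 * 50   # ~200 m x 200 m footprint, ~50 m average height
ball_diameter_m = 0.043              # a golf ball is roughly 4.3 cm across
ball_volume_m3 = (4 / 3) * math.pi * (ball_diameter_m / 2) ** 3
packing_fraction = 0.64              # random close packing of equal spheres

balls = stadium_volume_m3 * packing_fraction / ball_volume_m3
print(f"~{balls:.0e} golf balls")    # lands on the order of 10^10
```

The individual guesses can each be off by a factor of two or three; the estimate is still useful because errors tend to partially cancel and the answer is only claimed to one order of magnitude.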
If humanity survives far into the future, could we plausibly develop ways to slow or even halt the decay of the universe? Or is this an immutable characteristic of our universe, meaning humanity will inevitably fizzle out along with it?
This sounds more like an automation of that idea than just N-times the work.