It's unclear whether a human thinking things through will still be an advantage in 10 or 20 years. Might be, might not be. In 50 years, people will probably be outraged if a human makes an important decision without deferring to an LLM's opinion. I'm quite excited that we seem to be building scalable superintelligences that can patiently and empathetically explain why people are making stupid political choices, and what policy prescriptions would actually get a good outcome based on reading all the available statistical and theoretical literature. Screw people primarily thinking for themselves on that topic; the public has no idea.
> A handful of professors told me they hadn’t noticed any change. Some students have always found old movies to be slow, Lynn Spigel, a professor of screen cultures at Northwestern University, told me. “But the ones who are really dedicated to learning film always were into it, and they still are.”
The article doesn't actually give any evidence that attention spans have shortened. Many of the movies you study in film school are genuinely, excruciatingly slow and boring unless you're hyper-motivated. Before mobile phones, you had no choice but to sit through them. Now you have a choice. I suspect that film students 30 years ago, despite having a "full attention span", would also have been entertaining themselves on phones if they'd had them.
I love movies. But I also make liberal use of 2x speed and the +5s skip during interminable suspense sequences that are literally just someone walking through a dark environment while spooky music plays. It's not that I suffer from a short attention span; it's that there's nothing to pay attention to. There's no virtue in suffering through boredom.
All of Walmart's attempts at this have been focused on making Walmart's bottom line better, which is why every one of them has failed. Apple Pay makes my payment experience better, which is why I use it all the time.