How would you even know that this was the case?
I think if you're fortunate enough to really, deeply want something, then you should simply train to become good at it. Don't worry about your natural talents, since those will change.
Personal anecdote: I started learning a rigorous dance form in my late 20s. No fitness, movement, or musical background, just a programming/sit-on-my-ass background.
After 10 years of it, when I try something like tai chi now, the teachers pick out that I'm genuinely "gifted" or "talented". Then I tell them I'm a dancer and they're like "oh, that explains it".
This happened even 5 years into my dance training. And yet when I started, I had absolutely no talent for it; I always struggled with mysterious problems others never had, whether postural, rhythmic, or musical. I had them all.
My point is that identity change happens much faster than we imagine when you go all in. It doesn't take 50 years. But it's also slower than we imagine: it's not 5 months either. You have to understand the timelines of human change.
Of course, on day 1, week 1, year 1, even year 3, everything sucks. You can't then write an essay saying "here are my lessons from my learning journey". I will believe an essay when the author has given his youth to understanding the nature of talent, not when he's given it 3 years.
On a semi-related tangent, I recently listened to the audiobook of Ajahn Brahm's Mindfulness, Bliss and Beyond. It was pleasantly surprising to hear nimitta spoken about so frequently outside of the Visuddhimagga!
Ingesting Buddhist commentaries and practice manuals to provide advice and help with meditation is one of the few LLM applications that excite me. I was impressed when I received LLM instructions on how an upāsaka can achieve upacāra-samādhi!
My 3-year-old vastly prefers complex Carnatic music to Cocomelon (and its ilk). He can listen to a 15-minute, intricate song without losing interest, and will ask for it in a loop. Children can handle a lot more complexity than we generally assume.
The keyword in the title is "bullish". It's about the future.
Specifically, I think it's about the potential of the transformer architecture & the idea that scaling is all that's needed to get to AGI (however you define AGI).
> Companies will keep pumping up LLMs until the day a newcomer puts forward a different type of AI model that will swiftly outperform them.
fastai is also amazing, but it's made of 1.5-hour videos and is more free-flowing. By the time I even figured out where we had stopped last time, my time would sometimes be up, which was very discouraging. But later, once I had a little more time and some basic understanding from Andrew Ng's course, I was able to attempt fastai.