Why not? I’m not trying to inflame this further, I’m genuinely interested in your logic for this statement.
- Get paid more than I'm currently making; by that, I mean at least 20% more
I've no problem telling them my current salary, since it's always been in the upper quartile for my area. To get me to switch, the increase has to be significant. Otherwise it's not worth the risk of plunging into the unknown, or the pain of learning yet another spaghetti mess where generations of architecture astronauts added layer upon layer until the complexity exceeded their mental capacity to maintain it, and they left.
I've worked on titling systems, game development (C/C++), integration systems, and backend database systems. All those niche data models and systems live rent free in my head, and all of it is worthless to my current employer or the people around me, because they're focused on solving their own unique problems, which at the end of the day just become another piece of worthless business procedure in my head. It's worthless because businesses and people only care about solving their problem; once it's solved, they just move on to the next.
But I think the author's point is apt. There are a bunch of social issues that will arise or worsen when people can plug themselves into a world of their choosing instead of having to figure out how to deal with this one.
> Now this belief system encounters AI, a technology that seems to vindicate its core premise even more acutely than all the technologies that came before it. ChatGPT does respond to your intentions, does create any reality you prompt it to imagine, does act like a spiritual intelligence
This goes beyond spirituality of course. AI boyfriend/girlfriend, infinite AAA-grade content, infinite insta feeds at much higher quality and relevance levels than current; it’s easy to see where this is going, harder to see how we stay sane through it all.
Question: How do people figure out how to deal with this world?
Answer: People choose to plug themselves into a world of their choosing.
AI models do not lie, nor do they tell the truth. They synthesize character or pixel data according to complex algorithms and datasets running on silicon hardware. It's up to us humans to use our decidedly non-computer minds to interpret that output data as something which means either truth or falsehood (which itself is a whole separate debate over how we can know what is true, etc.).
"I tried the local Iranian market. I showed it to friends, family, and potential clients. Their response: "Nobody in Iran will pay $500/month for this. The Persian language quality isn't perfect. We'll use free ChatGPT instead.""
Which should have been free feedback on the risk vs. reward.