All the major cloud providers run high profit margins, somewhere in the 30-40% range.
I take all of my notes in ChatGPT now, and for more structured data I have built specialized tools/agents, and even small front ends, such as one for tracking my portfolio.
It's crazy how different my world was back then; note apps feel so primitive now.
It's an abstraction and a convenience, a way to avoid fiddling with registers, memory, and the like at the lowest level.
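To make the abstraction point concrete, here is a small sketch: even a high-level language like Python compiles your source down to lower-level stack-machine instructions that you normally never see or think about.

```python
import dis

def add(a, b):
    return a + b

# Show the bytecode the interpreter actually executes. The exact opcode
# names vary by Python version (e.g. BINARY_ADD before 3.11, BINARY_OP
# after), but you will always see loads, an add, and a return.
dis.dis(add)
```

The source line `return a + b` is the abstraction; the load/add/return instructions underneath are what the machine-level view looks like, one layer down.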
Everyone should be free to enjoy the computing platform of their choice in their own way. There's no need to mandate one way or another. You might feel all fired up about a particular high-level language that you think abstracts and deploys the right way. Not everyone does.
You don't need a programming language to discover yourself. If you become fixated on a particular language or paradigm then there is a good chance you have lost sight of how to deal with what needs dealing with.
You are simply stroking your tools instead of using them properly.
Then why are we using them to write code, which should produce reliable output for a given input... much like a calculator?
Obviously we want the code to produce correct results for whatever input we give it, and as it stands now I can't trust LLM output without reviewing it first. It's still a helpful tool, but ultimately I'd want them to be as accurate as a calculator, trusted enough that the review step isn't needed.
Using an LLM and being OK with untrustworthy results would be like clicking the terminal icon in my dock and sometimes getting a terminal, sometimes a browser, and sometimes a silent failure, because there's no reproducible output for a given input to an LLM. To me that's a problem: output should be reproducible, especially if it's writing code.
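A toy sketch of the reproducibility point (this is not any specific LLM's API; the token names and scores are made up): temperature sampling can return different tokens for the same input, while greedy decoding always returns the same one.

```python
import math
import random

def softmax(logits, temperature=1.0):
    # Turn raw scores into a probability distribution; lower temperature
    # sharpens the distribution toward the highest-scoring token.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_token(tokens, logits, temperature, rng):
    # Stochastic decoding: the same input can yield different tokens.
    probs = softmax(logits, temperature)
    return rng.choices(tokens, weights=probs, k=1)[0]

def greedy_token(tokens, logits):
    # Deterministic decoding: the same input always yields the same token.
    return tokens[max(range(len(tokens)), key=lambda i: logits[i])]

tokens = ["return", "print", "raise"]
logits = [2.1, 1.9, 0.3]  # hypothetical next-token scores

rng = random.Random()  # unseeded, so separate runs can differ
sampled = {sample_token(tokens, logits, 0.8, rng) for _ in range(50)}
greedy = {greedy_token(tokens, logits) for _ in range(50)}

print(sorted(sampled))  # frequently more than one distinct token
print(greedy)           # always the single argmax token
```

Fixing the RNG seed makes sampling reproducible within a single process, though real serving stacks reportedly add batching and floating-point nondeterminism on top, which is part of why identical prompts can still diverge in practice.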