On a side note... y'all must be prompt wizards if you can actually use the LLM code.
I use it for debugging sometimes to get an idea, or for a quick sketch of a UI.
As for actual code... what it writes is a huge mess of spaghetti, overly verbose, with serious performance and security risks, and a complete misunderstanding of pretty much every design pattern I give it.
Personally, I wrote 200K lines of my B2B SaaS before agentic coding came around. With Sonnet 4 in Agent mode, I'd say I now write maybe 20% of the ongoing code from day to day, perhaps less. Interactive Sonnet in VS Code and GitHub Copilot Agents (autonomous agents running on GitHub's servers) do the other 80%. The more I document in Markdown, the higher that percentage becomes. I then carefully review and test.
Dystopian, infuriating, unethical and immoral.
Look at the language Coinbase uses. Only their view is a "belief." The opposing view is a "worry." Others are motivated by fear. Only holy Coinbase is motivated by love!
This is, of course, doublethink. We all know that removing humans from the hiring process is, by definition, dehumanizing.
Coinbase's article would have been more palatable if it were truthful:
> Some believe AI will dehumanize the hiring process. We agree, and we're SO excited about that! I mean, we aren't in this business to make friends. We're in it to make cold, hard cash. And the less we have to interact with boring, messy human beings along the way, the better! If you're cold, calculating and transactional like us, sign on the dotted line, and let's make some dough!
But if they were that truthful, fun, and straightforward, they'd probably be more social, and they wouldn't have this dehumanizing hiring process to begin with.
It really doesn't matter what "they" said about books. We are talking about screen time. And screen time has measurably harmful effects on child development.
It leads to worse outcomes across the board. Sleep disorders. Obesity. Mental health disorders. Depression. Anxiety. Decreased ability to interpret emotions. Aggressive conduct. And this is to say nothing of ADHD (7.7 times higher likelihood in the heaviest screen users) or social media's effects on adolescents. [1][2]
[1] https://pmc.ncbi.nlm.nih.gov/articles/PMC10353947/
[2] https://www.webmd.com/add-adhd/childhood-adhd/childhood-adhd...
> This isn’t accidental. It’s cultural.
> We’re not innovating. We’re rebuilding broken versions of tools the web already gave us – and doing it badly.
> We’re not iterating toward impact – we’re iterating just to stay afloat.
It's not X, it's Y? Dashes? Question fragments? This style isn't just tedious—it's a hallmark of LLM-generated content.
The whole article feels like low-effort LLM-generated clickbait to fan the eternal flamewar between web developers and web app developers. Yes, you might not need React for a static blog. Yes, React is useful for writing web applications. Can we talk about something else now?
Examples:
- Screen sharing in Teams. There was a Gaussian blur over everything. I had this happen during a work call.
- Nvidia. I kept getting screen-tearing. I went through various guides, installed drivers and so on, but it never worked properly.
- Office. LibreOffice mangles my Office docs. The formatting is wrong, things are broken.
- Media. I had issues watching things that I could just watch on Windows.
Those kinds of issues were fun to me 20 years ago; they were part of the adventure of roughing it and sticking it to the man. Today, I don't have the time or energy. I'd rather use an OS that Just Works. When I need Linux, WSL has worked great.
I cancelled my Copilot subscription last week, and when it expires in two weeks I'll most likely shift to local models for autocomplete and simple stuff.
That said, months ago I did experience the kind of slow agent edit times you mentioned. I don't know where the bottleneck was, but it hasn't come back.
I'm on library WiFi right now, "vibe coding" (as much as I dislike that term) a new tool for my customers using Copilot, and it's snappy.