My best bet now may be to move to orbit like S.R. Hadden. But it'll have to be high orbit, away from the satellite constellations.
Do lawyers really still believe they can just throw some legal jargon at laypeople and we'll get confused and back down? Not only do we have every single law and legal precedent on a device in our pocket, we also have AIs that can instantly answer questions. I'm sure shit like that might have worked before 2010, when you'd have to scramble to figure out whether what they were saying was true, but it seems antiquated to attempt it nowadays.
Frameworks and compilers are designed to be leak-proof abstractions. Any way in which they deviate from their abstract promise is a bug that can be found, filed, and permanently fixed. You get to spend your time and energy reasoning in terms of the abstraction because you can trust that the finished product works exactly the way you reasoned about at the abstract level.
LLMs cannot offer that promise by design, so it remains your job to find and fix any deviations from the abstraction you intended. If you fall short of finding and fixing any of those bugs, you've left yourself a potential crisis down the line.
[Aside: I get why that's acceptable in many domains, and I hope in return people can get why it's not acceptable in many other domains]
All of our decades of progress in programming languages, frameworks, libraries, etc. have gone into building up leak-proof abstractions, so that programmer intent can be focused on the unique and interesting parts of a problem, with the other details getting the best available (or at least most widely applicable) implementation. In many ways we've succeeded, even though in many ways it looks like progress has stalled. LLMs have not solved this; they've just given up on the leak-proof part of the problem, trading it for exactly the costs and risks the industry was trying to avoid by solving it properly.
If my C compiler sometimes worked and sometimes didn't I would just mash compile like an ape until it started working.
It's obviously not wrong to fly over the desert in a helicopter. It's a means to an end and can be completely preferable. Personally I'd prefer to be in a passenger jet even higher above it, at a further remove. But I wouldn't think that doing so makes me someone who knows the desert the way someone who has crossed it on foot does. It's okay to prefer and utilize the power of "the next abstraction", but I think it's rather pigheaded to deny that anything of value is lost by the people who are mourning the passing of what they gained from intimate contact with the territory. And no, it's not just about the literal typing. The advent of LLMs is not the 'end of typing'; that's more reductionist failure to see the point.
The idea that you lose a ton of knowledge when you experience things through intermediaries is an old one.
I think just as hard, I type less. I specify precisely and I review.
If anything, all that's changed is that we work at a higher level. The product is the same.
But these people just keep mixing things up like "wow I got a ferrari now, watch it fly off the road!"
Yeah so you got a tools upgrade; it's faster, it's more powerful. Keep it on the road or give up driving!
We went from auto completing keywords, to auto completing symbols, to auto completing statements, to auto completing paragraphs, to auto completing entire features.
Because it happened so fast, people feel the need to rename programming every week. We're either vibe coders now, or agentic coders, or ... or just programmers, hey. You know why? I write in C, I get machine code, I didn't write the machine code! It was all an abstraction!
Oh but it's not the same you say, it changes every time you ask. Yes, for now, it's still wonky and janky in places. It's just a stepping stone.
Just chill, it's programming. The tools just got even better.
You can still jump on a camel and cross the desert in 3 days. Have at it, you risk dying, but enjoy. Or you can just rent a helicopter and fly over the damn thing in a few hours. Your choice. Don't let people tell you it isn't travelling.
We're all Linus Torvalds now. We review, we merge, we send back. And if you had no idea what you were doing before, you'll still have no idea what you're doing today. You just fat-finger fewer typos today than ever before.
It isn't an abstraction like assembly -> C. If you code something like "extract the raw audio data from an audio container", it doesn't matter if you write it in assembly, C, Javascript, whatever. You will be able to visualize how the data is structured when you are done. If you had an agent generate the code, the data would just stay an abstraction.
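To make that concrete, here's a minimal sketch of the kind of structure you internalize by writing this yourself: walking the chunks of a canonical WAV (RIFF) container to pull out the raw PCM bytes. This is purely illustrative; it assumes an uncompressed little-endian WAV with standard 'fmt ' and 'data' chunks, and the function name is made up.

```python
import struct

def read_wav_pcm(path):
    """Extract raw PCM samples from a canonical, uncompressed WAV file.

    Illustrative sketch only: assumes standard 'fmt ' and 'data' chunks
    in a little-endian RIFF/WAVE container.
    """
    with open(path, "rb") as f:
        # 12-byte RIFF header: 'RIFF', total size, 'WAVE'
        riff, _size, wave = struct.unpack("<4sI4s", f.read(12))
        if riff != b"RIFF" or wave != b"WAVE":
            raise ValueError("not a RIFF/WAVE file")
        fmt = None
        while True:
            header = f.read(8)
            if len(header) < 8:
                break
            chunk_id, chunk_size = struct.unpack("<4sI", header)
            if chunk_id == b"fmt ":
                # format tag, channels, sample rate, byte rate,
                # block align, bits per sample
                fmt = struct.unpack("<HHIIHH", f.read(chunk_size)[:16])
            elif chunk_id == b"data":
                return fmt, f.read(chunk_size)
            else:
                # skip unknown chunk, padded to an even byte boundary
                f.seek(chunk_size + (chunk_size & 1), 1)
    raise ValueError("no data chunk found")
```

Once you've written something like this, "an audio file" stops being an opaque blob: it's a header, a chunk list, a sample layout you can picture.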
It just isn't worth it to me. If I am working with audio and I build a strong mental model of what different audio formats/containers/codecs look like, who knows what creative idea that will trigger down the line. If I have an agent just fix it, then my brain will never even learn to think that way. And it takes like... a day...
So I get it as an optimized search engine, but I will never just let it replace understanding every line I commit.
Go ahead - I'm ready to be downvoted again and again until folks realize it is inevitable, just as it is inevitable that many companies in the business-software space are going down, down, down.
Sure, this is awesome now, and maybe he shipped it in a week using AI or something, but he now owns a critical part of his wife's business. Five years from now he's gonna be working 50 hrs/week and not want to deal with a project he barely remembers even doing; whenever an SSL cert goes bad, or the CC he was paying the server bills with expires, or actual bugs happen, he is on the line for it.
It feels lame to let family/friends pay $20/mo for something you could build in a few weeks, but then you own the product forever, and I don't want to.
The new tech is likely just for noisy environments and/or to enable whispered voice control of the phone.
I think you're anthropomorphising the AI too much: what does it mean for an LLM to have psychosis? This implies that LLMs have a soul, or a consciousness, or a psyche. But... do they?
Speaking of reality, one can easily become philosophical and say that we humans don't exactly "have" a reality either. All we have are sensor readings. LLMs' sensors are texts and images they get as input. They don't have the "real" world, but they do have access to tons of _representations_ of this world.
The waits are of unpredictable length, so you never know whether to wait or switch to a new task. So you just do something to kill a little time while the machine thinks.
You never get into a flow state, and you feel worn down by the constant vigilance of waiting for background jobs to finish.
I don't feel more productive; I feel like a lazy babysitter who's just doing enough to keep the kids from hurting themselves.