Can you expand on this?
I think to get the full meaning of both, you'd need to be fairly steeped in a world that uses those words all the time, AND they are often used to distinguish people who "get it" from those who don't.
Selectively pick and struggle through things you want to learn deeply. And let AI spoon-feed you for things you don't care as much about.
I've also wanted to play with C and Raylib for a long time, and now that I'm confident, I code by hand and struggle with it, using LLMs only as a backstop for when I get frustrated, like a TA during lab hours.
Sure, the junior manager might use them vaguely to mimic, but IMHO, when vague language comes up at decision tables, it's usually encoding something more precise behind a sort of plausible deniability.
A senior manager, reviewing a proposal, asks them to "synergize with existing efforts": your work is redundant; you're wasting your time.
A senior director talks about "better alignment" of their various departments: we need to cut fat and merge; start identifying your bad players.
etc etc.
If my impressions are correct, of course ICs are going to balk at these statements - they seem disconnected from reality, and that disconnection from the effects is deliberate. Yes, this is bad management from the ICs' perspective, but I think it's culturally inevitable for an in-group to signal its strategies in coded language.
A good manager takes this direction in front of all their ICs and laughs it off as corpo speak, but has been given the signal to have a private talk with the member of their group who triggered the problem... I dunno, maybe my time in management was particularly dystopian, but this seemed obvious once I saw it.
Conversely, when someone talks about "decolonizing" a curriculum or "centering" marginalized voices, to me it's a clear statement about who gets to define meaning and whose history counts, but to my Boomer uncle it's incoherent, if not an outright attack.
It's not all that different from the state of big corp software today! Large organizations with layers of management tend to lose all ability to keep a consistent strategy. They go all in on a single dimension, such as ROI for the next quarter, and miss the bigger picture. Good software is about creating longer-term value, and that takes consistent skill and vision to execute.
Those software engineers who focus on this big picture thinking are going to be more valuable than ever.
There are just so many small decisions that add up to a consistent vision for a piece of software. It doesn't seem like LLMs are going to be able to meaningfully contribute to that in the near future.
I tried vibecoding my own workout tracker, but there were so many small details to think through that it was frustrating. I gave up and found an app that is clearly made by a team of experienced, thoughtful people, and AI can't replicate the sheer thoughtfulness of every decision that went into it: the inputs for reps/sets, the algorithms for adjusting effort on the fly, an exercise library with clear videos and explanations. There's just no way to replicate that without people who have been trainers and sport scientists for decades.
LLMs can help increase the speed at which these details turn into something tangible, but you definitely can't "skip all that crap and just jump to the end and get on with it."
Otherwise, very cool and exciting!
I wonder if anyone is working on an AI framework that encourages us to keep our eye on the big picture, then walk away when a reasonable amount of work is done for the day.
Yes, individuals are creating cool mobile coding solutions, and Anthropic doesn't want to get left behind. I know I'm working my ass off at work right now because LLM coding makes it fun, but I also often don't prioritize for the big picture; I just work through everything that comes into my inbox, in order, because it's so fast to do with Claude Code.
We all sense it!: <https://newsroom.haas.berkeley.edu/ai-promised-to-free-up-wo...> <https://ghuntley.com/teleport/> <https://steve-yegge.medium.com/the-ai-vampire-eda6e4f07163>
Let's see how many days until something else tops it.
I can commiserate with this person's rant, even though it's built on a faulty initial premise, but it's a doozy. Kidnapping heads of state and indiscriminate bombing campaigns with massive collateral damage certainly don't fit my conception of "acting powerful."