Maybe his incisive polemic, which I greatly enjoy, was all but pandering to a certain elitist sensibility in the end.
To make manageable programs, you have to trade off execution speed, both on the CPU and in the organization. His rather mathematized prescriptions imply we should hire quarrelsome academics such as himself to reduce performance and slow down product development [initially…], all in the interest of his stratified sensibilities of elegance and simplicity.
Sucks to be right when that’s the truth.
Diagrams can be authoritative, and the ones I’ve seen will break some or all of these rules because they represent natural heuristics that practitioners are expected to fill in themselves.
"go back to 0x00, then jump GOTO return register bits ahead"
you still have a variable, the goto return register.
it is your nesting depth, and the nesting limit is reached when that register overflows.
please explain to me cuz im confused fr
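A minimal sketch of what goes wrong (a hypothetical mini-machine, not any real ISA; all names here are made up for illustration): with only one GOTO return register and no stack, a nested call clobbers the outer return address, so each additional level of nesting needs another saved word of state.

```python
# Hypothetical sketch: replacing CALL/RET with "goto 0x00 + return_register"
# still needs state. With a single return register and no stack, a nested
# call overwrites the saved return address, capping usable nesting at one.

return_register = None  # the lone "GOTO return register"
trace = []

def call(label, return_to):
    """Jump to `label`, remembering where to come back to."""
    global return_register
    return_register = return_to  # clobbers whatever was saved before
    trace.append(label)

def ret():
    """'goto' the saved return address."""
    trace.append(return_register)

call("inner", return_to="outer_resume")   # depth 1: fine
call("leaf", return_to="inner_resume")    # depth 2: clobbers outer_resume
ret()   # comes back to inner_resume
ret()   # comes back to inner_resume AGAIN; outer_resume is lost

print(trace)  # ['inner', 'leaf', 'inner_resume', 'inner_resume']
```

Generalizing this to one saved word per nesting level is exactly how you reinvent the call stack, which is the point of the comment above.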
In the aggregate, almost no programmer can think up code faster than they can type it in. But being a better typist still helps, because it cuts down on the amount you have to hold in your head.
The same goes for automatically generated boilerplate.
> If I don't know, how can I trust the generated code to be correct?
Ask the AI for a proof of correctness. (And I'm only half-joking here.)
In languages like Rust the compiler gives you a lot of help in getting concurrency right, but you still have to write the code. If the Rust compiler approves of some code (AI-generated or artisanally crafted), you are already pretty far along toward getting concurrency right.
A great mind can take a complex problem and come up with a simple solution that's easy to understand and obviously correct. AI isn't quite there yet, but getting better all the time.
And thank god! Code is a liability. The price of code is coming down, but selling code has been almost entirely supplanted by selling features (SaaS) as a business model. The early cloud services have become legacy dependencies by now (great work if you can get it). Maintaining code is becoming a central business concern in all sectors governed by IT (i.e. all sectors, eating the world and all that).
On a per-feature basis, more code means higher maintenance costs, more bugs and greater demands on developer skills and experience. Validated production code that delivers proven customer value is not something you refactor on a whim (unless you plan to go out of business), and the fact that you did it in an evening thanks to ClippyGPT means nothing. The costly part is always what comes after: demonstrating value and maintaining trust in a competitive market with a much shallower capital investment moat.
Mo’ code mo’ problems.
Ah, so functional reactive programs don’t suffer from stack overflow on recursion, they just suffer from memoisation overflow? ;)
Electronic circuits with feedback could be thought of as tail end recursion. :)
By all means, you should be familiar with the evaluation semantics of your runtime environment. If you don’t know the search depth of your input, or can’t otherwise constrain stack height beforehand, beware the specter of symbolic recursion. But recursion is so much more.
Functional reactive programs do not suffer from stack overflow on recursion (implementation details notwithstanding). By Church–Turing, every symbolic-recursive function can be translated to a looped iteration over some memoization of intermediate results. The stack is just a special case of such memoization. Popular functional programming patterns provide other bottled-up memoization strategies: fold/reduce, map/zip/join, either/amb/cond, the list goes on (heh). Your iterating loop is still traversing the solution space in a recursive manner, provided you dot your ‘i’s and carry your remainders correctly.
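A small sketch of the translation described above, using the Fibonacci recurrence as a stand-in: first as symbolic recursion (the call stack is the implicit memo), then as a fold over an explicit two-value memo of intermediate results.

```python
from functools import reduce

def fib_rec(n):
    # Symbolic recursion: the call stack memoizes the pending frames.
    return n if n < 2 else fib_rec(n - 1) + fib_rec(n - 2)

def fib_fold(n):
    # Iteration: the (a, b) pair is the entire memo of intermediate
    # results; stack depth stays flat no matter how large n gets.
    a, b = reduce(lambda acc, _: (acc[1], acc[0] + acc[1]),
                  range(n), (0, 1))
    return a

# Same recurrence, same answers; only the memoization strategy differs.
assert all(fib_rec(n) == fib_fold(n) for n in range(15))
```

The loop is still "recursive" in the sense the comment means: each step consumes the previous step’s output, but the memo is two words instead of a stack of frames.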
Heck, any feedback circuit is essentially recursive and I have never seen an IIR-filter blow the stack (by itself, mind you).
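The IIR analogy can be sketched in a few lines (a first-order low-pass, y[n] = a·y[n-1] + (1-a)·x[n]; the coefficient and signal here are arbitrary illustration values): the feedback term is recursive in the DSP sense, yet the loop carries a single word of state, so nothing grows with signal length.

```python
def iir_lowpass(samples, a=0.9):
    """First-order IIR filter: each output feeds back into the next."""
    y = 0.0   # the entire "memo": just the previous output
    out = []
    for x in samples:
        y = a * y + (1.0 - a) * x   # feedback: y depends on its own past
        out.append(y)
    return out

# Step response: feed a constant 1.0 for 50 samples.
steps = iir_lowpass([1.0] * 50)
# Output climbs toward the input level; state stays O(1) throughout.
```

Run it for fifty samples or fifty billion: the "recursion depth" of the feedback never touches the stack.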
And while you're at it: what is happening inside the mind when it reasons?