No amount of fantastical thinking is going to coax AGI out of a box of inanimate binary switches --- aka, a computer as we know it.
Even with billions and billions of microscopic switches operating at extremely high speed consuming an enormous share of the world's energy, a computer will still be nothing more than a binary logic playback device.
Expecting anything more is to defy logic and physics and just assume that "intelligence" is a binary algorithm.
I generally agree with the article's point, though I think "Will Never Happen" is too strong a conclusion, and I don't think the idea that simple components ("a box of inanimate binary switches") fundamentally cannot combine to produce complex behaviour is well-founded.
The logic and physics that make a computer what it is make it a binary logic playback device. By design, that is all it is capable of doing.
Assuming a finite, inanimate computer can produce AGI is to assume that "intelligence" is nothing more than a binary logic algorithm. Currently, there is no logical basis for this assumption --- simply because we have yet to produce a logical definition of "intelligence".
Of all people, programmers should understand that you can't program something that is not defined.
Humans are also made up of a finite number of tiny particles moving around that would, on their own, not be considered living or intelligent.
> [...] we have yet to produce a logical definition of "intelligence". Of all people, programmers should understand that you can't program something that is not defined.
There are multiple definitions of intelligence, some mathematically formalized, usually centered around reasoning and adapting to new challenges.
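One commonly cited example of such a formalization is Legg and Hutter's "universal intelligence" measure, which scores an agent by its expected performance across all computable environments, weighted toward simpler ones. A sketch (with $K$ denoting Kolmogorov complexity):

```latex
\Upsilon(\pi) \;=\; \sum_{\mu \in E} 2^{-K(\mu)} \, V^{\pi}_{\mu}
```

Here $E$ is the set of computable environments, $V^{\pi}_{\mu}$ is the expected cumulative reward of agent $\pi$ in environment $\mu$, and $2^{-K(\mu)}$ favors environments with short descriptions. Whether this captures what anyone means by "intelligence" is debatable, but it shows that a logical definition is not impossible in principle.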
There are also a variety of definitions for what makes an application "accessible", most of them not very precise, but that doesn't prevent me from improving the application in ways such that it gradually meets more and more people's definitions of accessible.