If you think hierarchies exist naturally, you're only looking at one of thousands of independent variables. Average them all out and we each tend toward mediocrity.
When someone appeals to hierarchies (e.g., "there's always a bigger fish"), they're just admitting to using a painfully one-dimensional worldview.
For sure, today’s LLMs lack the last two on the list. There is probably a rational debate to be had about whether these can simply emerge in the substrate provided by the LLM-like setting, or whether the brain provides some hardwired machinery, looping around focus selection and the perceive-model-decide-act cycle, that will need to be grafted onto LLMs to achieve AGI.