you must be completely LLMheaded to say something like that, lol
humans are not trained on spatial data, they are living in the world. humans are very different from silicon chips, and human learning is on another order of magnitude of complexity compared to large language model training
If this hurts your ego, then just know the dataset you built your ego with was probably flawed. If you can put that LoRA aside and try to process this logically: our awareness is a scalable emergent property of one to two decades of datasets. Looking at how neurons vs. transistor groups work, there can only be a limited number of ways to process data at this scale down to relevant streams. The very fact that training LLMs on our output works proves our output is the product of something LLM-like, or there wouldn't be patterns to find.
I know I'm too late to ask this question, but I suspect it's one of two things: feelings and intuitions, which are just a primitive IQ test, or some kind of aptitude test, which is just a different flavor of IQ test.