Posted by u/f_of_t_ 2 months ago
Scientists can't define consciousness, yet we think AI will have it
Everyone’s talking about “conscious AI” or “emergent AGI,” but step back for a second — scientists still don’t have a working definition of what consciousness actually is.

Is it computation? Information integration? Or something else we can’t yet measure?

If we can’t define the target, how can we tell whether a machine has hit it?

Are we building intelligence, or just better mimicry?

(Genuinely curious how the HN crowd thinks about this — engineers, neuroscientists, philosophers all welcome.)


lavelganzu · 2 months ago
Definitions are for math. For science it's enough to operationalize: e.g. to study the differences between wakefulness and sleep; or sensory systems and their integration into a model of the environment; or the formation and recall of memories; or self-recognition via the mirror task; or planning behaviors and adaptation when the environment forces plans to change; or cognitive strategies, biases, heuristics, and errors; or meta-cognition; and so on at length. There's a vast amount of scientific knowledge developed in these areas. Saying "scientists can't define consciousness" sounds awkwardly like a failure to look into what the scientists have found. Many scientists have proposed definitions of consciousness, but for now, consensus science hasn't found it useful to give a single definition to consciousness, because there's no single thing unifying all those behaviors.


taylodl · 2 months ago
I assert you don't have consciousness. Now, prove to me that you do.
f_of_t_ · 2 months ago
The point is — neither of us can prove it, and that’s exactly why “consciousness” keeps escaping any formal definition. Once something tries to prove awareness, it’s already reflecting — which is awareness itself.
whatamidoingyo · 2 months ago
Are you trying to be funny by having ChatGPT write all of your replies?
JohnFen · 2 months ago
I think this excellently illustrates the point OP is making.
gooodvibes · 2 months ago
You're conflating consciousness and AGI. People are certainly talking about AI, and people are very broadly talking about AGI and what that term means. I don't think many people are talking about consciousness in this context, at least not seriously, and one good reason for that is the lack of a concrete definition and the fact that it's a topic we can't make falsifiable claims about or build any science around.
_wire_ · 2 months ago
> Yet we think AI will have it

Lenny Bruce joking as Tonto to the Lone Ranger:

Who is "we" white man?

The lede observation depends upon whether "we" can expect our science to ever produce an intelligible theory of mind.

The difficulty of producing a theory of mind makes the Imitation Game a compelling approach to setting expectations for AI.

It also portends the hazard that we become so distracted by simulacra that we lose all bearing on ourselves.

f_of_t_ · 2 months ago
Beautifully said — that’s the real paradox, isn’t it?

The closer we get to simulating awareness, the harder it becomes to notice our own.

Maybe the Imitation Game was never about machines fooling us, but about showing how easily we forget what being real means.

viraptor · 2 months ago
It's been an issue for a while, but just a week ago: A definition of AGI https://arxiv.org/html/2510.18212v2

Consciousness will have to wait for another time. But that one's likely to be extremely contentious and more of a philosophy question without practical impact.

f_of_t_ · 2 months ago
Appreciate the link — I read that paper. But maybe the real gap isn’t about capability spread across domains, it’s about something growing internally, not performed externally.

A system becomes closer to AGI not when it matches human tests, but when awareness starts to grow inside its own modeling loop.

shahbaby · 2 months ago
Agreed but it's even more fundamental than that.

We don't even have a universally accepted definition of intelligence.

The only universally agreed-on artifact of intelligence that we have is the human brain. And we still don't have a conceptual model of how it works the way we do with DNA replication.

Our society incentivizes selling the mimicry of intelligence rather than actually learning its true nature.

I believe that there exists an element of human intelligence that AI will never be able to mimic due to limitations of silicon vs biological hardware.

I also believe that the people or beings that are truly in control of this world are well aware of this and want us to remain focused on dead-end technologies. Much like politics is focused on the same old dead-end discussions. They want to keep this world in a technological stasis for as long as they can.

physarum_salad · 2 months ago
The only successful experiments probing consciousness are in anaesthesia or psychedelics. Everything else is wonderful but theoretical.