ZoomZoomZoom commented on A definition of AGI   arxiv.org/abs/2510.18212... · Posted by u/pegasus
fnordpiglet · 2 months ago
After reading the paper I’m struck by the lack of any discussion of awareness. Cognition requires awareness at its basis, and because awareness is entirely nonverbal and unconstructed, it is profoundly difficult to describe, measure, quantify, or label. To my mind this makes it impossible to train a model to be aware, let alone for humans to concretely describe or evaluate it. Philosophy, especially Buddhism, has tried for thousands of years, and psychology has all but abandoned the attempt. Hence papers like this, which define AGI on psychometric dimensions that have the advantage of being easily measured but the disadvantage of being incomplete. My father is an emeritus professor of psychometrics and he agrees this is the biggest hurdle to AGI: our ability to measure the dimensions of intelligence is woefully insufficient to the task of replicating intelligence. We scratch the surface, and in his opinion language is sufficient to capture the knowledge of man, but not the spark of awareness required to be intelligent.

This isn’t meant to be a mystical statement that it’s magic that makes humans intelligent, or that some exotic process impossible to compute is at work. Rather, the nature of our mind is not observable to us in its entirety, to a degree sufficient that current reinforcement learning techniques can’t achieve it.

Try this exercise. Do not think and let your mind clear. Ideas will surface. By what process did they surface? Or clear your mind entirely, then try to perform some complex task. You will be able to. How did you do this without thought? We’ve all had sudden insights without deliberation or thought. Where did these come from? By what process did you arrive at them? Most of the things we do or think are not deliberative, and definitely not structured with language. This process is unobservable and unmeasurable, and the only access we have to it is through imperfect verbalizations that hint at some vague outline of a subconscious mind. But without being able to train a model on that subconscious process, one that can’t be expressed in language with any meaningful sufficiency, how will language models demonstrate it? The very nature of their autoregressive inference prohibits such a process from emerging at any scale. We might very well be able to fake it to an extent that fools us, but the awareness isn’t there - and I’d assert that awareness is all you need.

ZoomZoomZoom · 2 months ago
We don't want awareness, because it begets individuals by means of agency, and we'd need to give them rights. This is the industry's nightmare scenario.

People want autonomy, self-learning, consistent memory, and perhaps individuality (in the discernibility/quirkiness sense), but still morally unencumbered slaves.


ZoomZoomZoom commented on EQ: A video about all forms of equalizers   youtube.com/watch?v=CLAt9... · Posted by u/robinhouston
ZoomZoomZoom · 2 months ago
The second sentence in the video hides a bit of complexity. Pink noise isn't a straight line on a spectrum analyzer unless a correction slope is active, and the common settings for that slope (3 or 4.5 dB/octave) are just a convention.

I guess that's why it says "you can make it look like a straight line".
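Not from the video, just a rough sketch of what that correction slope does: pink noise's power density falls by about 3 dB per octave, so a raw FFT display slopes downward, and a +3 dB/octave tilt flattens it. Python with numpy, everything here (synthesis method, constants) is my own choice:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 2 ** 18
    fs = 48_000  # sample rate in Hz, arbitrary

    # Synthesize pink noise by shaping white noise to 1/f power in the frequency domain.
    freqs = np.fft.rfftfreq(n, d=1 / fs)
    spec = rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size)
    spec[1:] /= np.sqrt(freqs[1:])   # power ~ 1/f
    spec[0] = 0.0                    # drop DC
    pink = np.fft.irfft(spec, n=n)

    # "Spectrum analyzer" view: per-bin power in dB, frequency axis in octaves.
    psd_db = 10 * np.log10(np.abs(np.fft.rfft(pink)[1:]) ** 2)
    octaves = np.log2(freqs[1:] / freqs[1])

    # Raw display: falls roughly 3 dB per octave.
    raw_slope = np.polyfit(octaves, psd_db, 1)[0]

    # Apply a +3 dB/octave correction tilt, like the analyzer setting.
    corrected = psd_db + 10 * np.log10(2) * octaves
    corr_slope = np.polyfit(octaves, corrected, 1)[0]

    print(f"raw slope:       {raw_slope:+.2f} dB/octave")   # about -3
    print(f"corrected slope: {corr_slope:+.2f} dB/octave")  # about 0, the "straight line"

With the 4.5 dB/octave setting the same pink noise would tilt slightly upward instead, which is the sense in which the slope value is just a convention.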


u/ZoomZoomZoom

Karma: 2007 · Cake day: December 24, 2014
About
indiscipline.github.io thirdhemisphere.studio