pedalpete · 4 months ago
I believe that training a system to understand the electrical signals that define a movement is significantly different from a system that understands thought.

I work in neurotech, and I don't believe that the electrical signals of the brain define thought or memory.

When humans understood hydrodynamics, we applied that understanding to the body and thought we had it all figured out. The heart pumped blood, which brought nutrients to the organs, etc. etc.

When humans discovered electricity, we slapped ourselves on the forehead and exclaimed "of course!! it's electric" and we have now applied that understanding on top of our previous understanding.

But we still don't know what consciousness or thought is, and the idea that it is a bunch of electrical impulses is not quite proven.

There is electrical firing of neurons, absolutely, but does it directly define thought?

I'm happy to say we don't know, and that "mind-reading" devices are as yet unproven.

A few start-ups are doing things like showing people images while reading brain activity and then trying to understand what areas of the brain "light up" for certain images, but I think this path will prove fruitless in understanding thought and how the mind works.

Night_Thastus · 4 months ago
Agree completely. The brain is so incredibly complex that we've barely scratched the surface. It's not just neurons, which are very complex and vary wildly in genetics from one to another - it's hundreds of other helper cells all interacting with each other in sometimes bizarre ways.

To try to boil it all down to any simple signal is just never going to work. If we want to map consciousness, it's going to be as complex as simulating it ourselves, creating something as dense and detailed as a real brain.

thegrey_one · 4 months ago
I don't think it's anything other than electrical activity, but it's clearly not "some electrical signal". It's the totality of them. They are many, and complicated. And they seem to be required for consciousness. I doubt there's any proven conscious state in a human lacking electrical activity in the brain.

PaulRobinson · 4 months ago
We know that the brain is a structure that works through electrochemical reactions. Neurons send signals down axons, and synapses relay them to other neurons. We can test this. We can measure it. There's nothing else going on that we can describe using known science.

Ah, we might say, maybe there is an unknown science - we did not know about so much before, like electricity, like X-rays, like quantum physics, and then we did, and the world changed.

The difference is that in those cases we observed something that known science could not explain, and only then did we find the new science that explained it.

It's pretty clear to me - but you may know more - that we can explain all brain activity through known science. It might be hard to think of us as nothing more than a bunch of electrochemical reactions in a real-world reinforcement learning system, but that's what we are: there's no gap that needs new science, is there?

lukeinator42 · 4 months ago
Scalp-recorded EEG does not measure action potentials; it can only measure the graded potentials of essentially one type of neuron (pyramidal cells) in the cortex, which is a really tiny percentage of both the neurons and the electrical activity in the brain. Additionally, there are the various roles neurotransmitters play in the brain, and glial cells also seem to play an important role. So it's definitely not the case that there aren't any gaps that need new science, and even if there weren't, it's a pretty big stretch from there to decoding all brain activity solely through the electrical component.
opello · 4 months ago
It seems neatly organized to say "that we can explain all brain activity" while not bounding exactly what counts as "brain activity." I think prior to recent research [1], people would have concluded that memory was solely the domain of the brain. But the fact that sense/setting/environment allowed Clive Wearing to circumvent amnesia and access skills otherwise unavailable to his conscious mind [2] should raise questions about that understanding.

[1] https://www.nyu.edu/about/news-publications/news/2024/novemb...

[2] https://en.wikipedia.org/wiki/Clive_Wearing

estimator7292 · 4 months ago
No, none of this is settled. We cannot adequately explain brain function with current science.

There have been studies this year implying that some brain functions rely on quantum interactions.

JumpCrisscross · 4 months ago
> We know that the brain is a structure that works through electrochemical reactions. Synapses transmit signals sent by axons to neurons. We can test this. We can measure it. There's nothing else going on that we can describe using known science

But what we can describe using known science doesn't describe the system. That doesn't mean the vacuum is voodoo. It's just a strong hint something more is going on. (Like the photoelectric effect.)

We know more about dark energy and matter than the dark essence that separates our leading electrochemical models from consciousness.

jaapz · 4 months ago
Can we? We can only see whatever we can measure with the tools we currently have, which are based on the knowledge we currently have. Who's to say there isn't something out there we haven't discovered yet? There's more than enough we still don't understand in many domains of science.
behringer · 4 months ago
I think there is new science we need first. The brain very likely uses quantum processes. We don't understand quantum mechanics yet.

quantummagic · 4 months ago
> There are electrical firing of neurons, absolutely, but do they directly define thought?

Well, surgeons and researchers have shown that electrical stimulation of certain brain regions can induce "perception" during procedures. They can make a patient have the conscious experience of certain smells, for instance.

It's not conclusive proof of anything, but I wouldn't bet against us being closer to the mark than we were when we only considered hydrodynamics as the model.

JumpCrisscross · 4 months ago
> surgeons and researchers have shown that electrical stimulation of certain brain regions, can induce "perception" during procedures

I can carefully drop liquid reactants on a storage medium and induce nontrivial and reproducible changes in any computer reading it. That doesn't tell me how digital storage works, it just says I'm proximate to the process.

saboot · 4 months ago
It goes far beyond smells, in ways I find deeply unsettling.

We can induce religious experience, see "The God Helmet"

https://en.wikipedia.org/wiki/God_helmet

or deep depression & suicidal thoughts

https://www.nejm.org/doi/full/10.1056/NEJM199905133401905

dboreham · 4 months ago
> I don't believe that the electrical signals of the brain define thought or memory.

Yes and no. It'll be something like a JPEG file. You can have a JPEG file that contains an image of a cat. But give that file to someone who has no clue about JPEG encoding and the file looks like random noise. They'll take 100 years to figure out it's an image of a cat.

Actually, it's more like taking an electron-beam prober to one of Nvidia's AI GPUs while it's figuring out whether it likes Wordsworth's poetry.

WhitneyLand · 4 months ago
You say you don’t believe something is true and then say you don’t know, but I’ll disagree with “electrical signals don’t “define” (encode) thoughts.

To be clear, of course it’s true that our thoughts are more than just electrical activity. The brain is a system. However, it seems clear that thoughts are at least partially encoded in electrical activity.

What you say those startups will find fruitless has already been done for years in research settings. It may not be a successful business model, but it's already been demonstrated.

There are fMRI studies and electrical measurement studies. You could argue that fMRI decoding of images is not electrical activity, which is true, but a bunch of work shows they are strongly correlated.

From electrical activity alone we're already decoding information like words, so it's hard to claim electrical activity doesn't define thoughts.

Maybe you mean to say it doesn't define all the content of our thoughts, which is a much different claim.

biff1 · 4 months ago
Well, if you are making the assertion, which you implicitly seem to be, you must first define thought. Is a word == a thought? And as for correlations, we all know the adage about correlation and causation. Not that I would make the counter-argument that thought is not encoded by electrical signals, but I would bet you aren't totally correct. Do you think there will be no future paradigm shifts?
vbezhenar · 4 months ago
For me it's like attaching wires to a CPU and trying to decipher what YouTube video is playing right now.

Absolutely not possible.

pedalpete · 4 months ago
That is such a great analogy!

Another I heard is that measuring EEG is like standing outside a stadium during a match and listening to the roar of the crowd.

Reading thoughts through EEG is like standing outside the stadium, listening for the roar of the crowd, and based on what you hear, knowing what the umpire's mother-in-law had for breakfast.

layer8 · 4 months ago
One thing is probably true: you have to train on the individual person, and it's not transferable to a different person. Similar to how, if you trained on the fluctuations of one LLM's neural network to "read its thoughts", the results wouldn't transfer to interpreting the semantic contents of a different LLM's network activity.

So you probably can’t build a universal mind-reading device.
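The LLM analogy can be made concrete with a toy sketch (entirely hypothetical synthetic data; scikit-learn's LogisticRegression stands in for a real decoder): two random linear maps embed the same latent "concept" into two different activation spaces, playing the role of two brains or two LLMs, and a probe fit on one does not transfer to the other.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# A shared latent "thought": one binary concept plus noise dimensions.
n, d_latent, d_act = 2000, 32, 64
y = rng.integers(0, 2, n)
latent = rng.normal(size=(n, d_latent))
latent[:, 0] = y * 2.0 - 1.0  # the concept lives along one latent direction

# Two "subjects": each embeds the same latent state into its own
# activation space through a different random linear map.
W_a = rng.normal(size=(d_latent, d_act))
W_b = rng.normal(size=(d_latent, d_act))
acts_a = latent @ W_a + 0.1 * rng.normal(size=(n, d_act))
acts_b = latent @ W_b + 0.1 * rng.normal(size=(n, d_act))

# Fit a linear probe on subject A's activations only.
probe = LogisticRegression(max_iter=2000).fit(acts_a[:1000], y[:1000])

acc_same = probe.score(acts_a[1000:], y[1000:])  # held-out data, same subject
acc_xfer = probe.score(acts_b[1000:], y[1000:])  # same concept, other subject

print(f"same-subject accuracy:  {acc_same:.2f}")
print(f"cross-subject accuracy: {acc_xfer:.2f}")
```

The probe decodes the concept almost perfectly within the subject it was trained on and falls toward chance on the other, even though both encode exactly the same latent state: the information is there, but the encoding is idiosyncratic, which is why per-person calibration is needed.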

JamesBarney · 4 months ago
You can't build a universal mind-reading device that doesn't require calibration.
baxtr · 4 months ago
This sounds logical and convincing.

At the same time, it should also be easy to falsify.

Has an experimental setup like this been tested? If I'm not mistaken, it could falsify your claim.

Train a decoder on rich neural recordings, then test it on entirely new thoughts chosen under blinded conditions.

If it can still recover the precise unseen content from signals alone, the claim that electrical activity is insufficient is overturned.

plastic-enjoyer · 4 months ago
> Train a decoder on rich neural recordings, then test it on entirely new thoughts chosen under blinded conditions.

There have been enough studies on this, and the result is mostly the same: it's difficult to nearly impossible to reliably decode neural recordings that differ from the distribution of recordings the decoder was trained on. There are a lot of reasons why this happens; electrical activity being insufficient is not one of them.
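That out-of-distribution failure mode is easy to reproduce with a toy decoder (hypothetical synthetic data, not real neural recordings): a classifier that decodes well on recordings from its training distribution collapses when the baseline signal drifts, even though the task-relevant information is still present.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def record(n, shift):
    """Toy 'recordings': two classes separated along one signal direction."""
    y = rng.integers(0, 2, n)
    x = rng.normal(size=(n, 16))
    x[:, 0] += (y * 2 - 1) * 1.5  # class-informative direction
    return x + shift, y            # shift models session/state drift

x_train, y_train = record(1000, shift=0.0)
decoder = LogisticRegression(max_iter=1000).fit(x_train, y_train)

# In-distribution: new recordings drawn the same way.
x_iid, y_iid = record(500, shift=0.0)

# Out-of-distribution: same task, but the baseline on the informative
# channel has drifted by an amount comparable to the class separation.
drift = np.zeros(16)
drift[0] = 3.0
x_ood, y_ood = record(500, shift=drift)

acc_iid = decoder.score(x_iid, y_iid)
acc_ood = decoder.score(x_ood, y_ood)
print("in-distribution accuracy: ", acc_iid)
print("out-of-distribution accuracy:", acc_ood)
```

The decoder still "sees" the class difference in the drifted data, but its fixed decision boundary no longer sits between the classes, so accuracy drops toward chance. This is a distribution-shift problem, not evidence that the electrical signal lacks the information.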

PunchyHamster · 4 months ago
Seems like trying to take a single-pixel signal (so to speak) and interpolate an entire image out of it.
47282847 · 4 months ago
I thought it was pretty well established by now that other parts of the body likely participate in both memory and thought, a fully distributed system?
fainpul · 4 months ago
Does it make sense to think of thoughts, consciousness etc. as an emergent property of the neuronal activity in our brains?
observationist · 4 months ago
This is silly. It's the sum of electrical and chemical network activity in the brain. There's nothing else it can be. We've got a good enough handle on physics to know that it's not some weird quantum thing, it's not picking up radio signals from some other dimension, and it's not some sort of spirit or mystical phlogiston.

Your mind is the state of your brain as it processes information. It's a computer, in the sense that anything that processes information is a computer. It's not much like silicon chips or the synthetic computers we build, as far as specific implementation details go.

There's no scientific evidence that anything more is needed to explain everything the mind and brain does. Electrical and chemical signaling activity is sufficient. We can induce emotions, sights, sounds, smells, memories, moods, pleasure, pain, and anything you can experience through targeted stimulation of neurons in the brain. The scale of our experiments has been gross, only able to read and write from large numbers of neurons, but all the evidence is consistent.

There's not a single rigorously documented phenomenon, experiment, or any data in existence that suggests anything more than electrical and chemical signaling is needed to explain the full and wonderful and awe-inspiring phenomenon of the human mind.

It's the brain. We are self constructing software running on 2lb chunks of fancy electric meat stored in a bone vat with a sophisticated network of sensors and actuators in a wonderful biomechanical mobility platform that empowers us to interact with the world.

It explains consciousness, intelligence, qualia, and every other facet and nuance of the phenomena of mind - there's no need to tack on other explanations. It'd be like insisting that gasoline also requires the rage of fire spirits in order to ignite and power combustion engines - once you get to the point of understanding chemical combustion and expansion of gases and transfer of force, you don't need the fire spirits. They don't bring anything to the table. The scientific explanation is sufficient.

Neocortical networks, with thalamic and hippocampal system integrations, are sufficient to explain the entirety of human experience, in principle. We don't need fire spirits animating cortical stacks, or phlogiston or ether or spirit.

Could spirit exist as a distinct, separate phenomenon? Sure. It's not intrinsic to subjective experience, consciousness, and biological intelligence, though, and we should use tools of rational thinking when approaching these subjects, because a whole lot of pseudo-scientific BS gets passed as legitimate scientific and philosophical discourse without having any firm grounding in reality.

We are brains in bone vats - nothing says otherwise. Unless or until there's evidence to the contrary, let that be enough.

Night_Thastus · 4 months ago
I think you misunderstood the person you're responding to. They did not say there was some higher force beyond the physical pieces.

What they're saying is that the brain is really really complicated and our understanding of biology is far too rudimentary right now to be saying "yes, absolutely, 100% sure that we know the nature of consciousness from this one measurement of one type of signal".

* Neurons are very complex and all have unique mutations from one another

* Hundreds of other types of cells in the brain interact with them and each other in ways we don't understand

* The various other parts of the body chemically interact with the brain in ways we don't understand yet, like the gut microbiome

Trying to flatten all of consciousness to one measurement is just not sufficient. It's like trying to simulate the entire planet as a perfect sphere of uniform density. That works OK for some things but falls apart for more complex questions.

wat10000 · 4 months ago
There’s nothing in known physics that explains consciousness. I agree about the rest, but consciousness not only defies explanation by known physics, it’s so far beyond what’s known that there isn’t even any concept of what it could be. We barely have the ability to describe it, let alone explain it.
HAL3000 · 4 months ago
> Neocortical networks, with thalamic and hippocampal system integrations, are sufficient to explain the entirety of human experience, in principle.

Where did you get that? That's not an established scientific result; it's a philosophical stance (strong physicalist functionalism) expressed as if it were empirical fact. We cannot simulate a full human brain at the relevant level of detail, we cannot record every spike and synaptic change in a living human brain, and we do not have a theory that predicts which neural organizations are conscious just from first principles of physics and network topology.

> We can induce emotions, sights, sounds, smells, memories, moods, pleasure, pain, and anything you can experience through targeted stimulation of neurons in the brain

That shows dependence of experience on brain activity, but dependence is not the same thing as reduction or explanation. We know certain neural patterns correlate with pain, color vision, memories, etc., and we can causally influence experience by interacting with the brain.

But why is any of this electrical/chemical stuff accompanied by subjective experience instead of just running as a complex zombie machine? The ability to toggle experiences by toggling neurons shows connection, and that's it; it doesn't explain anything.

> We've got a good enough handle on physics to know that it's not some weird quantum thing, it's not picking up radio signals from some other dimension, and it's not some sort of spirit or mystical phlogiston.

We do have a good handle on how non-conscious physical systems behave (engines, circuits, planets, whatever). But we don't have any widely accepted physical theory that derives subjective experience from physical laws. We don't know which physical/computational structures (if any) are sufficient and necessary for consciousness.

You are assuming, without any evidence, that current physics plus "it's all computation" already gives a complete ontology of mind. So what is consciousness? Define it with physics; show me the equations. You can't.

> It's a computer, in the sense that anything that processes information is a computer. It's not much like silicon chips or the synthetic computers we build, as far as specific implementation details go.

We design transformer architectures, we set the training objectives, and we can inspect every weight and activation of an LLM. Yet even with all that access, tens of thousands of ML PhDs, and years of work, we still don't fully understand why these models generalize the way they do, why they develop certain internal representations, and how exactly particular concepts are encoded and combined.

If we struggle to interpret a ~10^11-parameter transformer whose every bit we can log and replay, it's real hubris to act like we've figured out a 10^14-10^15-synapse, constantly rewiring, developmentally shaped biological network, to the point of confidently saying "we know there's nothing more to mind than this, case closed lol".

Our ability to observe and manipulate the brain is currently far weaker than our ability to inspect artificial nets, and even those are not truly understood in a deep, mechanistic, concept-level explanatory sense.

> Your mind is the state of your brain as it processes information.

OK, but then you have a problem: if anything that processes information is a computer, and mind is "just computation", then which computations are conscious?

Is my laptop conscious when it runs a big simulation? Is a weather model conscious? Are all supercomputers conscious by default just because they flip bits at scale?

If you say yes, you've gone to an extreme pancomputationalism that most people (including most physicalists) find extremely implausible.

If you say no, then you owe a non-hand-wavy criterion: what's the principled difference, in purely physical/computational terms, between a conscious system (a human brain) and a non-conscious but still massively computational system (a weather simulation, a supercomputer cluster)? That criterion is exactly the kind of thing we don't have yet.

So saying "it's just computation" without specifying which computations, and why they give rise to a first-person point of view, leaves the fundamental question unanswered.

And one more thing: your gasoline analogy is misleading. Combustion never presented a "hard problem of combustion" in the sense of a first-person, irreducible qualitative aspect. People had wrong physical theories, but once chemistry was in place, everything was observable from the outside.

Consciousness is different, you can know all the physical facts about a brain state and still not obviously see why it should feel like anything at all from the inside.

That's why even hardcore physicalist philosophers talk about the "explanatory gap". Whether or not you think it's ultimately bridgeable, it's not honest to say the gap is already closed and the scientific explanation is "sufficient".

Terr_ · 4 months ago
From some dystopic device log:

    [alert] Pre-thought match blacklist: 7f314541-abad-4df0-b22b-daa6003bdd43
    [debug] Perceived injustice, from authority, in-person
    [info]  Resolution path: eaa6a1ea-a9aa-42dd-b9c6-2ec40aa6b943
    [debug] Generate positive vague memory of past encounter
Not a reason to stop trying to help people with spinal damage, obviously, but a danger to avoid. It's easy to imagine a creepy machine that argues with you or reminds you of things, but consider how much worse it'd be if it derails your chain of thought before you're even aware you have one.

fainpul · 4 months ago
This reminds me of "Upgrade" - a sci-fi movie about a paralyzed man who gets an AI brain implant that can move his body for him. It's pretty decent.

https://www.imdb.com/title/tt6499752

foobarian · 4 months ago
Also "Common People," first episode in season 7 of Black Mirror. One word: ads [1]

[1] https://en.wikipedia.org/wiki/Common_People_(Black_Mirror)

callamdelaney · 4 months ago
Can you imagine having chatgpt in your brain to constantly police wrongthink? Would save the British media a job.
tim333 · 4 months ago
It might be able to react fast enough to prevent the horror of the wrongthink reaching twitter.
ftrsprvln · 4 months ago
When it infers illicit intent, it "corrects" you by biasing the output: a misclick here, a poisoned verb there... phantom intention drift™ injected into your parietal lobe milliseconds before your conscious even boots.
iberator · 4 months ago
You should make a text-based game.
guiand · 4 months ago
Split brain experiments show that a person rationalizes and accommodates their own behavior even when "they" didn't choose to perform an action[1]. I wonder if ML-based implants which extrapolate behavior from CNS signals may actually drive behavior that a person wouldn't intrinsically choose, yet the person accommodates that behavior as coming from their own free will.

[1]: "The interpreter" https://en.wikipedia.org/wiki/Left-brain_interpreter

brnaftr361 · 4 months ago
Split brain experiments have been called into question.[0]

[0]: https://www.sciencedaily.com/releases/2017/01/170125093823.h...

comboy · 4 months ago
> The patients could accurately indicate whether an object was present in the left visual field and pinpoint its location, even when they responded with the right hand or verbally. This despite the fact that their cerebral hemispheres can hardly communicate with each other and do so at perhaps 1 bit per second

1 bit per second and we are passing complex information about location in 3d space?

empath75 · 4 months ago
That's a great paper, but I don't think it calls into question anything about post-hoc rationalizations, and it might actually put that idea on more solid ground.
pinkmuffinere · 4 months ago
Wow this is fascinating, and gets rid of one of my eldritch memetic horrors. Thanks for sharing, I’m going to submit it as its own post as well!
zh3 · 4 months ago
AI following the Libet (1983) paper [0] about preconscious activity apparently preceding 'voluntary' acts (which really elevated the question of what 'free will' means).

[0] https://pubmed.ncbi.nlm.nih.gov/6640273/

Lerc · 4 months ago
The prima facie case for free will* is that it feels free. If you can predict the action before the feeling, that removes the argument (unless you want to invoke time travel as an option).

*One of the predominant characterisations of free will, anyway. I'm a compatibilist, so I have no issue with caused feelings of decision-making being in conflict with free will. I also have a variation of Tourette's, so I have a different perception of doing things wilfully compared to most people. It's really hard to describe how sometimes you can't tell whether something was a tic or not.

immibis · 4 months ago
I don't see why having some latency in the path of free will makes it no longer free. Before my arm moves up, there is a motor neuron that fires that is always correlated with my arm moving up; doesn't that just mean the free will occurs earlier in the process than the motor neuron firing?
kelseyfrog · 4 months ago
There are a lot of things I feel that end up not being "real", like embarrassment, failure, and anxiety. Why should free will not be like any of those?
bananaflag · 4 months ago
Hm, but maybe you can predict the feeling before you can predict the action. Checkmate atheists :)

(for the record I am also a compatibilist)

keybored · 4 months ago
That it precedes voluntary acts tells us that most of what we do is not conscious. Which has been known for over a century, maybe millennia.

(opinion stolen from some Chomsky video)

criddell · 4 months ago
Well, what does free will mean to scientists?
czl · 4 months ago
There is no single definition for all scientists. However, if you define free will as choices that are completely free of deterministic (or even statistically deterministic) causes that science could in principle predict, then most scientists would say: no, that kind of free will probably doesn't exist.
handedness · 4 months ago
> is it time to worry?

Shouldn't the device be the judge of that?

rpq · 4 months ago
I think the real danger lies in how many will accept that output as the unadulterated unmistakable truth for actions, for judgment. Talk about a sinister device.
smilebot · 4 months ago
You don’t need a sinister device. This is essentially how propaganda works.
analog8374 · 4 months ago
A handsome, well-dressed alpha speaking with confidence and certainty. That's truth right there.
rpq · 4 months ago
Propaganda is mostly without Science. This is with.
mostertoaster · 4 months ago
OK, does anyone else's mind immediately go to "The Minority Report" soon no longer being just a sci-fi dystopia?
absoluteunit1 · 4 months ago
For me, it immediately made me think of Psycho-Pass.

It’s a cyberpunk anime where society uses a system called the Sibyl System to constantly scan people’s mental states and “crime potential” (their Psycho-Pass).

People can be arrested before they’ve done anything - just because the system picked up certain signals from them.

Very, very interesting idea

mostertoaster · 4 months ago
Oh that sounds cool. Thanks for sharing. I’m definitely going to check it out.
thousand_nights · 4 months ago
yes, first thing i thought of. although i'm quite confident it's still outside the scope of our lifetimes, i do worry for future generations

cma · 4 months ago
Rather than the Karpathy thing about in-class essays for everything, maybe random selections of students will be asked to head to the school fMRI machine and be asked to remember the details of writing their essay homework away from school.
Lerc · 4 months ago
fMRI machines are not cheap, nor plentiful.

If, one day, someone can make a small, cheap device that does the job of an fMRI, it will be more world-changing than you can imagine. If you had easy access to real-time data about what is going on in your brain, there is evidence to suggest that you can learn to influence the data and literally change your own mind.

fy20 · 4 months ago
That's actually happening. Commercially you can buy a 0.55T system such as the Siemens Free Max for around $500k.

There are also developments in ultra-low-field fMRI (<0.1T) which use permanent magnets, estimated to retail in the five-figure range; however, it's more for structural usage (it can identify a tumour or stroke progression).

What you are saying sounds like being able to control your own heart rate if you see it on a monitor. Maybe combining low-resolution fMRI with models trained on higher-resolution data could give you enough visibility that you could learn how to activate areas of the brain that you wouldn't normally use for tasks.