baddox · 7 years ago
Oh dear. This isn’t my field at all, and maybe I’m just doing a bad job of understanding their point, but this sounds completely bogus.

Really...the brain doesn’t create representations of visual stimuli or store memories? Under what possible definitions of those words can this statement be sensical?

Surely the author believes that visual stimuli cause measurable changes in brain state, and that people can indeed remember past visual stimuli. Then how is it true that brains don’t create representations of visual stimuli and store and retrieve them? I’m at a loss here.

Perhaps the author means that the brain doesn’t do these things in the same way as digital electronic computers we’re familiar with. That’s certainly the case at the most basic level.

TheOtherHobbes · 7 years ago
I'm not sure I understand the piece either, but I think it's trying to say that human memories are associative, sequential, and distributed rather than localised.

So there isn't "a representation" in the discrete sense. It's more like the entire system changes, and it's impossible to physically scan specific elements of it to retrieve selected content.

You can trigger selective recall, but you're triggering a complex and noisy process which generates an experience that may include remembered elements - not pulling out a predictable bit pattern.

There isn't an exact equivalent in CS. Traditional binary memory is obviously nothing like human memory. Neural nets have some superficial similarities, but they lack generality.

I'm not completely convinced by the argument, but I'm glad someone is making it.

The problem with it is that we can remember specific discrete facts quite easily. If you ask me how many flats the key of F major has, I can tell you without being distracted by other memories.

What we don't know is how that fact is represented, how exactly my brain changed after I learned it, how similar those changes would be to changes in other brains learning the same fact, whether everyone has similar subjective experiences on recall, or how to scan someone's brain to check whether or not the fact is known.

behringer · 7 years ago
I think researchers are working on this very thing, and it seems like the brain does store "visual representations" or at least it can't be ruled out at this point.

https://www.theregister.co.uk/2013/08/20/mindreading_mri_spo...

And of course, why wouldn't the brain store a visual representation of what you saw? That would be the easiest way to store and retrieve it, which is why we do the same on computers.

jacobush · 7 years ago
It is very hard to introspect into what cannot be introspected... but with such things as how many flats the key of F major has (a question I only vaguely understand), I have a hunch that such recall is heavily cross-checked and error-corrected by correlation with many, many other known facts. You may associate it with maths lessons in elementary school, visual charts, piano lessons, the special smell of the paper, and simple number theory (nothing fancy), all of which together makes it nearly impossible to corrupt the fact of how many flats F major has upon recall.

Whereas for the kinds of memories we easily do corrupt upon recall (was the perpetrator tall or blonde?), there may be no such data to draw from. Or worse, only generalised "data" that is good for pattern matching but not for actual recall. (The stuff prejudice is made of.)

If so, then there is no contradiction or problem, just a very intricate mesh of data.
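
If the hunch is right, one crude software analogue is storing a fact redundantly and recovering it by majority vote, so that no single corrupted trace changes what is recalled. A toy sketch, not a claim about neural mechanisms (F major has one flat):

    import random
    from collections import Counter

    copies = ["one flat"] * 7                            # redundant traces of the same fact
    copies[random.randrange(len(copies))] = "two flats"  # corrupt a single trace

    recalled, _ = Counter(copies).most_common(1)[0]
    print(recalled)  # still "one flat": the corrupted trace is outvoted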

(Also I have an idea which is not even a hunch, but pulled out of thin air, that our "logic" and "memory" are much more intermixed in our wetware than in computers.)

ChrisSD · 7 years ago
Surely retrieving an individual fact is more like a conditioned response than like reading some bytes off a hard drive?

When I say "what is 9 times 9", your brain activates all the pathways (probably trained in childhood) that lead to you thinking "81", in a similar way to Pavlov's dog.

marcosdumay · 7 years ago
So, the brain has a distributed and complex architecture. Well, did anybody not know that already? Evolution has no love for engineering beauty.

If recall is possible, then there is a representation there. It's obvious that there is a representation of Beethoven's 5th Symphony in the brain of somebody who can play it. It's just convoluted, distributed, and encoded in some completely crazy signal space. Yet, if it weren't there, the person wouldn't be able to play it.

traject · 7 years ago
The encoding assumption in neuroscience is also nicely argued against in this preprint. https://www.biorxiv.org/content/biorxiv/early/2018/07/13/168...
russdill · 7 years ago
Closest equivalent in CS would be lossy storage, Monte Carlo algorithms, or of course, neural networks.
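
A Monte Carlo estimator is a nice illustration: it never reads its answer back from storage, it regenerates an approximation from a noisy sampling process every time it is asked. A toy sketch:

    import random

    # Estimate pi by random sampling: the answer is reconstructed from a
    # noisy process on every run, never retrieved verbatim.
    def estimate_pi(samples=100_000):
        inside = sum(1 for _ in range(samples)
                     if random.random() ** 2 + random.random() ** 2 <= 1.0)
        return 4 * inside / samples

    print(estimate_pi())  # close to 3.14159, slightly different every time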
skywhopper · 7 years ago
Yes, “computer” here means “digital electronic computer”, and yes, storage, retrieval, and processing all happen very differently in the brain than they do in digital computers.

You may think these are dumb points for the author to make, but it’s not clear to me at all that the media or the VCs who buy the hype about machine learning, AI, and self-driving cars realize just how different they are.

naasking · 7 years ago
> Yes, “computer” here means “digital electronic computer”, and yes, storage, retrieval, and processing all happen very differently in the brain than they do in digital computers.

No, it's incorrect to use "digital computer" here. More correct might be a von Neumann architecture computer. But then that shows that the author is attacking a strawman: people comparing brains to computers aren't limiting themselves to such an architecture.

bkdbkd · 7 years ago
Totally with you here. Not a neuroscientist, but I am a computer scientist. Lots of strong opinions, not a lot of strong facts. The analogies are interesting, but they break down quickly.

"We don't create representations of visual stimuli"..."We don't retrieve information or images or word from memory registers" Neither do computers in many cases. It's as if the author is saying because in the brain isn't a tape recorder or film camera then it doesn't work like a computer. Nope. Studies show that much like a computer we encode information as we store it. Because we encode it fancy ( or weird :-) doesn't mean its not encoded or retrieved. The Dollar Bill example is a red herring. Just ppitballing here: What if, instead of creating an image for the dollar, the subject's mind created a visual'thumbnail' and 'hash' of the real dollar. The thumbnail she can recall easily and lets her draw 'enough' of the bill to be vaguely recognizable. The hash, lets her recognize the real deal whenever she sees it. She simply compares the hash of this new object with list of hashes, and if its a dollar the mind finds a match. Of course, it far more sophisticated, but these simple methods, cleanly account for what the author observed.

barberousse · 7 years ago
>Surely the author believes that visual stimuli cause measurable changes in brain state

They repeat throughout that they do

> and that people can indeed remember past visual stimuli.

They agree with that too

>Then how is it true that brains don’t create representations of visual stimuli and store and retrieve them?

>Perhaps the author means that the brain doesn’t do these things in the same way as digital electronic computers we’re familiar

The author definitely agrees with the second statement, from what I understand. However, I think where people are getting lost is that they expect the author to resolve definitions for "cognitive"/information-processing terms (because in our day and age, the two are treated as identical), when actually the author is deliberately, rhetorically refusing them validity. Hence, in those places where the author is expected to supply an equivalently robust counter-definition, they instead pose a far more general one, such as, with respect to learning, "the brain changes". The name for this rhetorical strategy is "refusing the blackmail of the Enlightenment".

leftyted · 7 years ago
The dollar bill thing seems silly. The fact that you can draw anything without looking at a dollar bill means something is being stored, right? That means the brain stores information. There's no way out of that. And the fact that you can draw the dollar bill on cue means something is being retrieved. No way around that either. It doesn't matter how the information is represented. The brain-as-a-computer analogy doesn't specify that "neurons are bits" or whatever.

I don't expect the brain to work like any computer we've ever built (which seems to be the point of view this writer is attacking), but I do expect that it has the capacity to store, retrieve, and process information and so the computer analogy seems useful.

imh · 7 years ago
Yeah, you could make the same complaints about jpeg compression.
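
To make that concrete, a sketch of a lossy JPEG round trip (assumes Pillow is installed and a hypothetical dollar_bill.png):

    import io
    from PIL import Image

    # Round-trip an image through aggressive JPEG compression: the stored
    # bytes are a genuine representation, yet decoding them does not
    # reproduce the original pixels exactly.
    original = Image.open("dollar_bill.png").convert("RGB")  # hypothetical input file

    buf = io.BytesIO()
    original.save(buf, format="JPEG", quality=5)  # very lossy encoding
    recalled = Image.open(io.BytesIO(buf.getvalue()))

    # Individual pixels usually differ, but the decoded image is still
    # recognisably the same picture.
    print(original.getpixel((0, 0)), recalled.getpixel((0, 0)))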
zzzeek · 7 years ago
Jpeg compression was my thought exactly. This author knows extremely little about computers.
mannykannot · 7 years ago
It is trivially true that the brain is not a digital electronic computer. You cannot, however, use that simple fact to show that the brain is not some sort of information-processing device, and as for the notion that brains do not store information, I wonder what he thinks memories are.

The author concludes by asking "Given this reality, why do so many scientists talk about our mental life as if we were computers?" He offers no support for the proposition that this view is common, and I suspect he is often taking, as literal, speech that was intended to be metaphorical.

seiferteric · 7 years ago
The author seems to have far too narrow an idea of what a computer is.
naasking · 7 years ago
This seems like the appropriate response to the article. In fact, the article is factually wrong on the matter of what we're born with, and even on what it considers "information".

I suspect this misunderstanding of "information" is the core of the confusion. He needs to revisit physics and learn some computer science, because information and physics are inextricably intertwined, so the brain very much operates on information using rules.

Edit: and further, the brain is a finite state automaton due to the Bekenstein bound, a result from physics.
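
For a sense of scale, here's the usual back-of-the-envelope version of that bound; the brain's radius and mass are rough assumptions:

    import math

    # Bekenstein bound: I <= 2 * pi * R * E / (hbar * c * ln 2) bits
    hbar = 1.0546e-34  # J*s, reduced Planck constant
    c = 2.998e8        # m/s, speed of light
    R = 0.1            # m, rough brain radius (assumption)
    m = 1.5            # kg, rough brain mass (assumption)
    E = m * c ** 2     # J, total mass-energy

    bits = 2 * math.pi * R * E / (hbar * c * math.log(2))
    print(f"{bits:.2e} bits")  # on the order of 10**42: enormous, but finite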

nmstoker · 7 years ago
Yes, this is precisely what I thought as I read it.

I don't doubt his expertise on the brain side, but he characterises the computer in a very limited way, almost perfectly suited to make his argument.

With the reconstruction of the dollar, there are plenty of examples of how computers need not store an entire instance of a thing to be able to recognise it later, such as applications of hash functions or spam filtering.
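
A Bloom filter is a minimal example: it answers "have I seen this before?" without keeping any of the items around (a sketch with made-up items):

    import hashlib

    class BloomFilter:
        """Recognises items it has seen without storing any of them."""

        def __init__(self, size=1024, num_hashes=3):
            self.size = size
            self.num_hashes = num_hashes
            self.bits = [False] * size

        def _positions(self, item):
            # derive several bit positions from the item via hashing
            for i in range(self.num_hashes):
                digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
                yield int(digest, 16) % self.size

        def add(self, item):
            for pos in self._positions(item):
                self.bits[pos] = True

        def probably_contains(self, item):
            return all(self.bits[pos] for pos in self._positions(item))

    seen = BloomFilter()
    seen.add("a particular dollar bill")
    print(seen.probably_contains("a particular dollar bill"))  # True
    print(seen.probably_contains("a euro note"))               # almost certainly False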

presscast · 7 years ago
In a former life I was a cognitive neuroscience researcher.

This reads like a piece written by someone who heard a neuroscientist take issue with the "brain as computer" metaphor, but didn't quite grasp what it was all about.

The "brain is not a computer" meme has to do with the fact that the brain does not process information in the same way as a digital computer. It is not saying that the brain is not a symbol-processing/computational system.

charleshmorse · 7 years ago
I think we commenters are all on the same page here :)

The author is almost making it seem like models are reality and that people think that. They're not and I don't think anyone has ever thought they were...

Further, and as other comments have already mentioned, the brain is thought of and treated as a Turing machine, not a digital computer. It's done this way because the brain can be mapped to the definition of a Turing machine.

And I have to defend von Neumann. In his book The Computer and the Brain, he explored Turing equivalences between the brain and the computing concepts of his time used to implement digital machines; he didn't actually think that the brain was a one-to-one mapping to a digital computer... He knew the difference between models and reality.

Even the historical models the author mentions (hydraulics, automata, etc.) all contain some Turing equivalences if implemented correctly; people were simply using the language and examples of their time to express this.

The author also continues to mangle any and all ideas of modeling, abstraction, and equivalence throughout the whole article. With regard to his 'uniqueness problem': 'information loss' is modeled digitally for a reason. Just because humans are lossy doesn't mean we can't model them that way. Think of a compressed image file.

I don't think there's a single researcher worth their salt that thinks the 'IP Metaphor' is gospel. That is just a grossly unscientific idea to assume.

We're all free to choose any model or collection of models we wish to approximate reality, but some of them work better than others and the brain is a complicated thing to model.

The author is trying to dramatize a triviality.

barberousse · 7 years ago
>The author is almost making it seem like models are reality and that people think that. They're not and I don't think anyone has ever thought they were...

The author is arguing that when there is a "monopoly" of models with respect to a given domain, like the brain, that monopoly inexorably tends to make the conceptual distinction between model and reality irrelevant. The author then goes on to cite examples of this, not only with respect to our current age's infatuation with the IP model, but _previous_ ages' own repetition of this with respect to their guiding technological frameworks (hydraulic engineering and the humors, etc.).

cuspy · 7 years ago
I think you're very wrong if you think all professional neuroscientists are disinterested witnesses, or that they all keep adequate philosophical distance from the IP metaphor. Its assumptions constrain, or at least shape, how all academic neuroscience research is valued.

Academic science isn't an apolitical, free space. We are not all free to choose any model, and what it means for a model to "work better" comes down to evaluative criteria that are almost always baked into a particular set of theoretical assumptions.

tnzn · 7 years ago
>The author is almost making it seem like models are reality and that people think that. They're not and I don't think anyone has ever thought they were...

That's where you're wrong. Way too many people, many of them engineers, take models to represent reality, and that's a real, big problem, because it is deeply linked to how they regard science.

sykic · 7 years ago
How can the brain be mapped to the definition of a Turing machine? It doesn't have an equivalent of an infinite tape, and it doesn't work according to anything like a Turing machine's table of rules. Can you point me to a reference for this claim?

Most of the comments here don't like the article, but almost none of the commenters seem to have studied this issue. It gives me the impression that these are visceral reactions. The article is not an article for experts; it's expository in nature.

One thing that stood out for me was this quote:

As the neurobiologist Steven Rose pointed out in The Future of the Brain (2005), a snapshot of the brain's current state might also be meaningless unless we knew the entire life history of that brain's owner – perhaps even about the social context in which he or she was raised.

If true, this seems to me (very much a non-expert) to cast serious doubt on the notion that the brain is a computer.

russdill · 7 years ago
The brain is a biological machine; it follows the laws of physics. There isn't any part of known physics that prevents us from simulating any number of atoms using a suitably powerful computer.

A small aside: even fully simulating a small collection of quantum particles is enormously taxing with current computers, and adding more particles increases the complexity more than linearly. But this is a mathematical exercise.

Now, it's possible that the human brain depends on some law of physics that is not computable (i.e., not possible to simulate on a computer), but given the level of study that has gone into neurons, along with the temperature of the brain versus the energy ranges we've examined with colliders, it seems super unlikely.

If it helps, Turing machines with n-dimensional tapes have been proven equivalent to the basic Turing machine.

mannykannot · 7 years ago
The 'infinite tape' issue is a red herring that often comes up in these discussions. Real computers (such as the one I am typing this on) are informally described as 'Turing equivalent' because they can implement a Universal Turing Machine up to the limitation imposed by their finite memory. An alternative way of looking at it is that their model of computation, augmented with unbounded memory, could implement or simulate a Universal Turing Machine.

This is not the equivocation that it may appear to be, as it establishes a sort of asymptotic boundary between what is possible and what is not (the more memory we have, the closer we can get to it.) It also means, for example, that we don't have to wonder if there is one computer instruction set or architecture that can perform computations that are impossible by another (again, up to having sufficient memory to complete it.)

The author of the claim you are questioning has not, so far, returned to explain what he means, but I think he is saying that the brain is Turing-equivalent in the informal sense given above: we can compute like a Turing machine, up to the available tape/memory (though with a very limited tape, if we are not writing things down...)

If that is so, then I (one of the people here criticizing the article) must say that I don't think it is relevant. An alternative interpretation of the statement, that it claims there has been shown to be a Turing machine equivalent to the human brain, would seem to depend on believing (as I happen to) that the brain's functioning is a matter of electro-biochemistry that could, in principle, be simulated by a computer; but no one, so far, has given a demonstration, or even a convincing explanation, of how that works at a Turing-machine level of abstraction.

With regard to the quote you offer: I think it is a simple case of rhetorical overreach -- one might need to know the entire history of that brain to fully understand everything there is to know about its current state, but that does not mean that, absent that full history, the state is meaningless. In understanding what a person is thinking, what they remember (which is an aspect of their brain's state) is more important than what actually happened.

marcosdumay · 7 years ago
> If true, this seems to me (very much a non-expert) to cast serious doubt on the notion that the brain is a computer.

Yeah. It also puts Quantum Mechanics into serious doubt...

So, I'll wait for better evidence than some random person on the internet thinking it feels right.

cuspy · 7 years ago
Models are not equivalent to the phenomena they describe.

Computational models are not an exception to this.

There is not even a single "part" or "function" of the brain that we fully, exhaustively understand through a computational explanation. All claims of certainty are premature.

What's really fascinating and really needs the attention of historians and anthropologists is why in this current historical moment so many STEM educated people who are otherwise very bright end up confused about this. Maybe the answer is obvious though.

your-nanny · 7 years ago
The author's notion of a computer does not serve him well. It is too grounded in his experience of digital computing devices rather than in an understanding of computing as a kind of process. Furthermore, the field of computational neuroscience is doing quite well, thank you. Temporal difference learning, for example, is both an algorithm and something instantiated in brains in some form.
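
For the curious, the core of temporal difference learning is a one-line error-driven update. A minimal TD(0) sketch, with states, rewards, and constants invented for illustration:

    # estimated value of each state
    values = {"A": 0.0, "B": 0.0, "C": 0.0}
    alpha, gamma = 0.1, 0.9  # learning rate, discount factor

    # one experienced episode: (state, reward, next_state)
    episode = [("A", 0.0, "B"), ("B", 0.0, "C"), ("C", 1.0, None)]

    for state, reward, next_state in episode:
        next_value = values[next_state] if next_state is not None else 0.0
        # nudge the estimate toward the observed reward plus discounted lookahead
        values[state] += alpha * (reward + gamma * next_value - values[state])

    print(values)  # only "C" moves after one pass; repetition propagates value backwards

The prediction-error term in that update is the quantity that has been compared to dopamine signals, which is the sense in which the algorithm is said to be instantiated in brains.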