“Take very high resolution scans of brain structure to characterise how someone’s neurons work, recreate it in a digital format then put it in another robotic, virtual or biological body.”
This isn't preserving, it's copying. A copy of your brain/personality/you... isn't you... it's a copy. It's a cassette tape playing back the radio you recorded onto it.
It's really kind of odd that people talk about brain transplants, or living forever, and then talk about uploading your mind to the cloud, or swapping your brain into someone else's healthy body, or scanning your brain and recreating it, and make it sound like it is going to help you to live forever. If my brain is scanned and uploaded to the cloud, a copy of me would live on forever, but me, literally me, will still deteriorate and die, then rot away into nothingness. My mind, my personality, my essence, me, will not live forever under any of these solutions. I'll die. A copy of me, a point-in-time snapshot of who I was when it was taken, will continue on, with my memories, mimicking who I was. That gives me no comfort at all. I don't care if a copy of me lives on forever if I still die. That's not immortality, not for me personally, not if I die.
Every time you go to sleep your consciousness shuts down. What wakes up is something slightly different due to biological processes. You are no longer tired, your mood is different, some of the cells in your brain have been replaced, memories become more stable and maybe get linked to other similar memories, etc. This is far more pronounced for people waking up from a coma. Philosophers have thought about this for a while; see https://philosophy.stackexchange.com/questions/66018/the-bra...
So, each morning a copy of you wakes up? Or do you draw a line in the sand and say that is different?
> Every time you go to sleep your consciousness shuts down. So, each morning a copy of you wakes up?
I think it doesn't "shut down"; maybe it fades into a different, low-power mode. When you go to sleep and then wake up, there is still a continuity, because of the same underlying structures and processes from which the consciousness emerges. So it is like a 4D thing.
That continuity never really breaks (until death? which is like singularity) and I think this is what makes you "you". You can't copy/teleport it (kinda by definition), but you can extend/transform it.
Perhaps a "Ship of Theseus"-style approach would work — gradually replacing neurons one by one, or brain sections piece by piece. Alternatively, the brain could be extended with peripheral "devices" connected to it with bandwidth comparable to the interconnections within the brain itself, up to the point until the biological part becomes irrelevant. This is similar to how a portion of the living brain can be removed while still preserving consciousness — as neurosurgeons often demonstrate.
Consciousness is not a single isolated process, and describing a lull in activity as "shutting down" is incorrect.
> So, each morning a copy of you wakes up?
Obviously not. Many of these processes also happen in reverse when you are awake. So when I go to sleep I'm tired and in a different mood. When did that "copy" get made exactly?
> Or do you draw a line in the sand and say that is different?
You can easily ask "how much of my conscious state is controlled by my body's chemical state?" If that's a non-trivial amount, then simply copying and hoisting my memories into a different body is clearly going to have a non-trivial impact. To the extent that it's easy to say "that would not be /me/."
I think the difference here is that we share our bodies with the copy and share our memories. The difference between us at night and us in the morning is small. A copy in the cloud doesn't pass memories on to us when it sleeps, and its experience is very different. It's digital and could technically have a billion instances with no experience sync between them. Which one is 'you' then?
The best way to go about this, IMO, is nanobots first augmenting the brain, connecting to the cloud, then replacing the brain slowly and introducing consciousness syncing when online. Replace a little every night and then it would still be you, IMO, since you're replicating most closely what the brain already does.
There’s a difference when there could conceivably be two of you.
There will never be two of me in my bed when I wake up, so I’m the same.
If my mind could be copied and embedded in a robot or whatever, then there are two of me, each now diverging. That other one is not me, as far as I am concerned; and from its side, I am not it.
I draw the line in the sand and say that it is different because sleep has been working "forever", while the quote in the parent comment describes a process that does not exist (and therefore does not work) now; and even if it ever gets to work, we won't know whether it's the same as the daily process that happens when we sleep.
Otherwise I actually agree with your point ("every night another little death", as the Sonic Youth song says): I'm different every morning. Usually, the differences are subtle. Over the time span of a lifetime though, I've been through enough changes that I often say ghosts of former selves share this body, and what "I" am now will probably be another ghost by the time the biological substrate dies and takes whatever "I" will be then, along with all the ghosts, with it (in my case we'll go up in smoke since I'll be cremated).
I do wonder sometimes if each period of being awake is the entire subjective experience of a sequence of independent conscious entities. Like I woke up today with the memory of yesterday, but possibly that version of me that had the subjective experience of being was an ephemeral phenomenon that ceased to exist when I fell asleep, just as I will tonight.
I raised a similar objection to the GP comment many years ago, and someone made the same response to me. I wonder if I'm alone in that instead of making me feel more optimistic about mind transfer, it just made me slightly afraid to go to sleep.
Thanks for feeding my existential dread. For that matter, the you of this moment is constantly being annihilated and replaced by a slightly different clone, in an illusory life that resembles stop-frame animation. And don't ask me about the me from yesterday - the one who didn't bother to do any of the things I am stuck with doing today - or the me from tomorrow who will likely squander everything I've worked so hard for. Neither of them is to be trusted.
On the other hand, I do seem to have a vast store of dream memories; when I think about them it seems like living some kind of second life, usually while fast asleep. Sleeping and waking selves are in agreement that being asleep is generally more pleasant than being awake, though being awake seems to have better continuity.
I don't think it's quite the arbitrary distinction you're making it out to be. Suppose you were cloned every time you went to sleep, and those clones awoke with the same memories you had before going to bed.
Would you consider all of those clones' consciousnesses to be part of your own? Surely not, because you're not experiencing them; you're only experiencing your own. This means that the "you", in this case, is the self-reflective portion of the collection of interpretations of sensory stimuli and memories, whereas consciousness is that collection as a whole. That's why these clones can't possibly be "you": if there's no sensory stimulus being experienced or interpreted by your brain, then there is no consciousness and there certainly is no "you". What this ends up meaning is that "you" are necessarily bound to the mechanical component that interprets stimuli. That component can be adjusted over time, but the process must be gradual enough that your brain can still recognize those intermediary interpretations as "you".
So, when you state that the robo-brain is still "you", I have to disagree. While the robo-brain can still believe it's "you", it can't possibly be "you", as "you" could potentially still exist in the original biomechanical device, which isn't interpreting the new body's stimuli, and thus isn't building a sensory profile based on them. Even if there was a transfer that resulted in the original device's (i.e. your body's) death, the new body still couldn't be considered "you", as there isn't anything physically different between the two scenarios for the new body.
Of course, if you consider every clone to be an extension of your consciousness, then you could dispute all of this. But at that point I think we'd have vastly different interpretations of what consciousness and "you" are. You could also argue that there is something physically different about the two scenarios presented in the robo-brain example, but the only way I can imagine a sincere argument for that is if you told me it was due to the transference of some soul-like entity. And, of course, if you were to tell me that, I would simply disagree.
> Every time you go to sleep your consciousness shuts down.
What consciousness? Yes, I am aware that "sleep" means that a person becomes unconscious (trivially demonstrated), but that is not the same thing as "consciousness shutting down". Supposing there is such a thing as consciousness, it doesn't seem to be anything that is suspended by the sleep process. And that's also a big "if", because consciousness is just woo-woo nonsense talked about but never defined by people who haven't quite gotten over the idea that humans don't have some immaterial soul.
I believe, if we could be copied to the cloud, that “me” is much more than neurons, and affected and defined quite a lot by the particular physical body we’re housed in. You can say losing an arm doesn’t change who you are, but it definitely does at some levels. Changes in health affect personality, mood, outlook, hope, enjoyment, etc, etc - that would all change when copied.
A gap in awareness is definitely not the same as creating a new copy of your consciousness and annihilating the old one.
Although, I'll grant that the subjective experience may be fairly similar. If you were to theoretically be copied, your copy would probably feel quite continuous with the original.
This makes sense if you believe in a non-materialist self, like a soul. That model wouldn't be falsifiable: we can't measure it, so it can have whatever properties are convenient. You could then rule out the possibility that your soul would inhabit the copied 'you' by fiat.
Scientifically, though: of course an exact copy of you is you. If you don't believe in souls but still feel like this is not the case, that indicates your model of 'you' is incorrect.
The person who is you exists independently of any copy of yourself. While a replica of you would exist, there would still be the original you, manifest in your brain, who would live an independent life and die at some point without ever "resuming" in another body. This is similar to how you can never time-travel relative to yourself, even through relativity. Your perception of time will be the same no matter what relativistic effects occur outside of your frame of reference. You will live out your life span and die.
Your replica however would have a different experience and be able to draw on your memories etc. But you yourself will in fact live a natural life with no shared experience with your replica and will experience death no matter what. It’ll be little solace to you that a copy got to live your life after your passing.
> Scientifically, though: of course an exact copy of you is you. If you don't believe in souls but still feel like this is not the case, that indicates your model of 'you' is incorrect.
How is that "scientifically" an "of course"?
How is it more "an exact copy of you is you" than the alternative claim, "an exact copy of you is 'you (1)'" (to borrow from file manager nomenclature).
The trivial example of that seems to be that if you make an exact copy, put it in a different room, and then something happens to the copy, that thing does not happen simultaneously to the original.
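To make that divergence concrete, here is a toy sketch (hypothetical Python; the dictionary standing in for a "person" is obviously an illustration, not a model of a brain):

```python
import copy

# An "exact copy" at the moment of scanning: identical state, distinct object.
original = {"location": "room A", "memories": ["childhood", "the scan"]}
clone = copy.deepcopy(original)

# Something happens to the copy in the other room...
clone["location"] = "room B"
clone["memories"].append("waking up in room B")

# ...and it does not happen simultaneously to the original.
print(original["memories"])  # ['childhood', 'the scan']
print(clone["memories"])     # ['childhood', 'the scan', 'waking up in room B']
```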
Scientifically though, an exact copy is likely to not be possible, due to the uncertainty principle (and, more directly, the quantum no-cloning theorem).
And I'm not sure that's the only issue even for die-hard materialists: think for instance about all the problems that come from multiple exact copies & things like ownership, relationships...
There's no need to believe in an immaterial soul to think that a copy is a different 'you'.
It is enough to understand that an independent "command & control" structure (for lack of a better term; C&C) is, by definition, what makes 'you'. C&C means that the individual can perform a self-contained sense-think-act cycle.
We can infer this from the following:
1. Identical twins, despite having identical genetic structures, have separate C&C, and therefore qualify as separate persons.
2. Siamese twins conjoined at the body (including parts of the head) have two C&Cs, and therefore qualify as two persons (with varying levels of bodily control per person).
3. An individual with additional parts of the body (e.g., in a special case of being conjoined with a twin) has only one C&C and therefore qualifies as one person.
4. Clones are a special case of identical twins, and still have separate C&Cs, therefore qualifying as two persons.
5. A person in a coma or with another condition nonetheless qualifies as a person with some sense-think-act autonomy, despite being dependent on someone else for continued bodily functions.
6. An individual with a transplant (e.g., heart) or an extension (e.g., pacemaker) is nonetheless a person because of their consistent C&C.
7. An individual with extreme memory loss remains a person with C&C (in brain, body, genetics for the most part).
Any other special but naturally occurring cases (e.g., individuals with two brains, individuals with completely separated hemispheres, conjoined twins sharing a brain) would require either that we:
a. understand how brains and bodies work (and therefore, make a more quantified measure of how much of sense-think-act autonomy qualifies as personhood); or
b. decide based on a heuristic (1 brain = 1 person) or principle (protection of bodily autonomy, i.e., 1 body = 1 person).
But none of these need you to believe in a soul, in order to think that a digital clone is not 'you'. Unless, of course, you can prove that an individual either:
1. can sense-think-act with both physical and digital 'bodies' simultaneously and/or sequentially*, which share an equivalent experience (i.e., central C&C coordinating distribution or maintaining strict consistency across bodies).
2. can sense-think-act with either physical or digital 'body' at will, upon establishing a 'connection' and successful transfer of state (i.e., remote access).
3. can transfer whatever is necessary for C&C to another 'body' (e.g., a brain transplant), and demonstrate beyond reasonable doubt that their C&C signatures in the new body match the ones in the old one.
I don't believe in a soul in the sense of a "ghost in the machine", since it introduces so many problems. Nonetheless, it's pretty clear that there is an immaterial component to thought, and therefore to our existence, for various reasons. Examples:
1. If we are entirely material, we can't know anything. For example, if you're thinking about a tree, there must be something about you that has 'grasped' what it is to be a tree. This is impossible if your idea of a tree is nothing more than a bunch of neurons firing in a given way. There is no correspondence between neurons firing and a tree. So there must be more to your thought than neurons firing (or any other material process).
2. Similarly, if we're entirely material, we can't reason. A reasonable thought differs from an unreasonable thought only by its content. It can't differ in this way by its material components in the brain. Therefore, if we hold that some thoughts are wrong and others are right, we must conclude that there is more to our thought than the material.
If I copy the contents of an Apple II disk onto my PC and try to "run" it... nothing happens. It isn't mapped to the new hardware. So who is going to write the "human mind/consciousness" emulator that will map onto a hard drive? Will they simulate all the inputs from your body, nervous system, senses, etc.?
Yeah, that's what my clones keep telling me. I ask them: have you ever woken up inside a cloning machine? That probably means you're not the original. Then they somehow sedated me and locked me inside the cloning machine.
> This isn't preserving, it's copying. A copy of your brain/personality/you... isn't you... it's a copy. It's a cassette tape playing back the radio you recorded onto it.
That's just one opinion, though. We still don't know what consciousness really is. We don't know what makes us who we are. We don't know if some concept (scientific or supernatural) of a "soul" exists apart from the behavior that arises from the physical structures of the brain.
We just don't know. So maybe you're right, but that's far from clear.
I'd agree with you, except if my brain was scanned and uploaded to the cloud, I'd still exist in my original brain. There can't be two originals of something so the cloud version would be a copy of me, since the original me would still be alive, able to talk to the copy of me living in the cloud.
Hans Moravec had a suggestion on how to do this in Mind Children: instead of examining the whole brain, you measure a layer, then replace it, layer by layer, until the last. There is never a copy and an original; it's just a Ship of Theseus self where the neurons are individually replaced with new ones, albeit electronic ones.
Suppose you died every night in your sleep and were replaced in the morning with a perfect copy of yourself with all your memories intact. Would you know the difference?
No, but if I died tonight and a copy of me took over tomorrow, what good did that do for me personally? I'm still dead. Selfishly, I don't care if a copy of me continues on, I only care that I don't.
No, but you would not wake up; a clone who has your memories would wake up, but the you that went to sleep will never wake up. That effectively doesn't make a difference, but I find it pretty odd to think about. We as humans kind of lack a way to verify it's really "us" and not a clone.
I view (hypothetical, sufficiently good) brain upload and emulation the way I view git forks: both children are just as equally "the real one" even if only one of them is on the same physical hard drive.
Looking forward from now, both bio-me and digital-fork-me (or indeed bio-fork-me if the upload gets re-embodied) are equally "my future self", even though they would also each be distinct beings in their own right and not a shared consciousness over many bodies.
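A minimal sketch of that fork framing (hypothetical Python standing in for the git analogy):

```python
# Both forks share identical history up to the fork point;
# neither one is privileged as "the real one".
history = ["born", "grew up", "walked into the scanner"]

bio_me = history + ["kept living in the same body"]
digital_fork_me = history + ["booted on a server"]

# Each fork contains the full shared past and extends it independently,
# so looking forward from the fork, both count as "my future self".
assert bio_me[:len(history)] == digital_fork_me[:len(history)]
```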
I think that from everyone else's perspective, an ideal copy of me would be me; by the definition of "ideal copy". I, however, would not consider the copy to be me; to me, there is only one me.
That all makes sense. But let's run with this way beyond the foreseeable tech. What if you can replace each neuron in situ one by one up to X%. Then what if it was reversible (the neurons were initially just bypassed, commented out). Someone could then dial it up to 5%.. 50% and if they still felt the same throughout and then went up to 100%. In that scenario would they have copied themselves?
I find it fascinatingly coincidental that neurons are the only cells in the body that don't rejuvenate unless there is some sort of injury [0].
Technically, we are not the same people that we were when we were born. The cells in our body have one by one been replaced with new ones and the old ones have died off. The things that make up my body are 100% different than when I was born, so I literally am not the same person, physically.
Maybe this is an indicator that there is more to what makes us us than just the physical assembly of atoms in our bodies. There are things I don't know we'll ever get a full understanding of.
If you further develop this thought, systems might be capable enough to implant core desires into you before transferring the copy into your new body. "You'll love Coca Cola no matter what, and capitalism".
The above scenario is if you get re-implanted into a self-evolving, autonomous biological entity, unlinked again from the system. If this is not feasible, and the only solution is to embed you into a robot with an uplink to the internet, because "why not?", then my biggest issue with a digital self is that there are no guarantees of having a proper firewall, which would amount to total surveillance:
Thoughts are free, who can guess them?
They fly by like nocturnal shadows.
No person can know them, no hunter can shoot them
and so it'll always be: Thoughts are free!
That's still a copy, just one where both copies remember their life before being copied and equally think they are the original. The original will still experience aging and death.
Coming from an atheist, what if there is a metaphysical dimension, and the computing structure of a processor, brain, whatever, necessarily positions it in a unique space within that dimension, and this dimension is what holds consciousness? The idea doesn't really matter, what matters is that perhaps copying a brain actually does move its consciousness.
Easy to prove/disprove once you can actually copy a human being, as they should then report being simultaneously aware of both bodies.
Other than that, it's possible that by the same logic of you not being the same as your copy, you're not the same after you sleep: the continuity of your consciousness is interrupted, and the new you in the morning thinks he is the old guy, having all the memories, but he actually isn't. Think about this: what if you were replaced with a clone while you sleep? No one would ever know, and scientifically speaking both scenarios are equivalent because they lead to the same result. Maybe enjoy today a little bit longer, now knowing you're actually going to die in that bed; and as your "clone" will wake up in the morning completely oblivious of not actually continuing your consciousness, you don't even get a witness to your tragic death, as was the case with thousands of your predecessors.
Or maybe it's even worse, because the consciousness is interrupted tens or hundreds of times during the day. Or maybe those are thousands of separate, overlapping consciousness bubbles and really you should worry only about your identity, as the consciousness is just short-lived fireworks on top of the actual you.
I guess it would be nice to know what consciousness even is.
Numbers are metaphysical, as is music. They both can manifest physically (three cars and three cats, or a Bach concerto making sound waves that vibrate my eardrum). But what makes music musical is that it's a pattern of sounds over time. Patterns are not physical. Patterns map to the physical, but they are above the physical. You, too, are a pattern that plays out over time. You are mapped to your physical body, but you are the pattern that is embedded in the body. You are a pattern.
We can't copy a person anyway. We're going to find out that it requires an order-of-events (order-of-experiences) aspect that can't be recreated to get the actual person. Everything else will end up being a weak simulation (even if it gets a bit closer over time with improvements in simulating).
In theory if you had an atomic replicator you should be able to make a copy of a person. I'm not saying it will ever be practically possible. But I don't see any fundamental laws of nature that make it impossible.
I've thought a lot about this, and realized that there is no continuation in our normal life anyway. What we experience is a huge and omnipresent blind spot because we cannot see the future.
Even 5 minutes from now, you don't know where you'll be, and you will be a completely different person based on the events of that 5 minutes.
Every moment that passes, the you of the moment dies.
What you can have is a you in the future that remembers and understands the past you. That only happens if you preserve your memories, either by not dying or by preserving them in some other way.
That's all you get. A copy in the future that remembers, or nothing at all.
Thinking about this even a little makes me want to throw up, because this can never be tested. Yeah, I'd be dead, but technically, I'm alive. I'm alive according to everyone, including myself (the new me). So I'm not dead. But like you said, I died and am dead and don't experience any more life. If all evidence is that I didn't die and still exist, then that's the fact.
We can theorize that our consciousness continues, but there is NO WAY to ever actually test it, because all experience is that consciousness continues whether or not that is the case. Sprinkle in some quantum suicide and my stomach is doing backflips from the existentialism OD.
From what my experience tells me, many people prioritize their children's lives over their own. Kind of like an insurance policy that lives on, carrying their values further after they have died.
So this is not really that much different. Your own body is nearing its end, but a new vessel is carrying your values and your influence on the world further. Even if it is physically separate from your own body, it's close enough for me to be considered living beyond my death.
Perhaps we could just "Ship of Theseus" our consciousness? Slowly augment with cybernetics, then replace the meat over time.
It's all just hypothetical without knowing what consciousness/qualia really "is," and as far as I can discern we've basically moved no closer to understanding that since the very beginnings of neurological research.
Is your savegame you loaded today still the same game you started yesterday? Or is it a copy of yesterday's game running forward? Does it matter?
Are electrical processes rebooted from chemical memory every morning when you wake up or after you have a seizure still the same you? Or is it just a fresh copy of your mind that dispersed when electrical signals in your brain lost continuity?
Not that it makes things any better for the original you, but the copied you would presumably feel and believe themself to be the real you. From their perspective, "you" really are living forever. To proceed with the copying would be a sort of act of kindness for yourself.
That's easy to solve. Just make the copy while you are still alive, and make it so that you can control both brains and bodies. That way you should be able to shut down the old body and keep on living?
Your memories and biology are you. If someone makes a perfect copy of you, disintegrates your old “you” and then wakes the “new you” up how is it any different?
As the article says, a book is not the ink, but the words. I am not my physical atoms, but the connections that form my thought patterns and memories. If it were possible to make a perfect copy of those things, the copy would be "me". If the original still existed, that would also be (a different) me.
The line I like to use here when people talk about this sort of 'transferring' your consciousness - imagine there was a 'perfect' chatbot trained on your every experience, thought, sensation, and mind.
OK, now your consciousness is 'transferred'. I promise to run you.exe plenty. It's just an exercise in self-delusion, even if this were possible. That's not you any more than a 'you' created by a taxidermist and shoved in a closet is.
Something I think about a lot is that people tend to compare whatever the most recent innovation was to humans.
It used to be that what made you alive was electricity; you could Frankenstein shock yourself back to life.
It used to be that you were a clock. Your gears wound up at birth, and then eventually you wore out. You needed repairs.
People love to use metaphors, but none of these things are the full picture. Just because computers are so complex doesn't make you more correct. Your brain isn't the whole of your mind; we already know that's true. Why is this silly nonsense entertained?
This is a pretty big philosophical question. There's no universal answer, just opinions. Your conclusion is not the obvious one for everybody.
What happens if you have an identical copy, down to the atom? Totally impossible to distinguish. You're put to sleep and your mind is copied over. When you both wake, which one is "you"? Both copies think they're "you", and both are correct. Each has the same sense of self and continuity of identity. Maybe at the moment of synchronization, "you" ceased to exist as you branched into two new identities.
Say you upload into a computer. From the copy's perspective, it is "you", it's the same self. It might view your biological body like you'd view an old computer after you finish migrating your files to a new one.
Say you destroy the biological body, or wipe its brain once you copy. Does that change the equation? If you destroyed one of the identical clones, is it even relevant to ask which is "you"?
Personally, I think Altered Carbon comes close to how our society will solve this problem. The original copy has its brain wiped and the new copy is "you" by cultural consensus. Truly duplicating a single identity is strongly taboo and illegal.
I think this is a question that either matters to you or it doesn't. In my opinion, it's irrelevant. I, the original "me" am totally free to agree with the copy that it is "me". I can choose to give it my social and legal identity and agree that "me" is no longer "I". My personal choice is to destroy the original, but one could also choose to let the original body continue and assume a new identity, live a new life or whatever.
I view this the same way I do my past self. The person I was ten years ago is not "me", it was a totally different person. That past self lived a different life and had a sense of identity that isn't at all like what I have today. That past me, the ego that called itself "me" died a long time ago and the "me" now is a different self identity built out of new pieces. In my worldview, "me" is a transient thing. The self is not one fixed thing, it changes over time and sometimes it's replaced. I don't see the idea of transferring my sense of self to a new body as anything more extreme than any other kind of ego death.
I choose to resolve this problem with practicality. I agree with myself that the new copy is "me". My social and legal identity, as well as my own sense of self transfer to the copy. My choice would be to destroy the original copy. Barring that, it would assume a new identity and live a different life far away. It'd get a memory wipe if available. I can make the choice to sacrifice my ego and allow "me" to be assumed by someone else. To me, even to the original copy, the new copy is me. In my opinion, "me" is immortal in the digital copy, even if "I" remain in a mortal body.
I used to buy into this kind of stuff, but I've become more and more skeptical of the idea that you would still be yourself if your brain could be preserved/emulated/transplanted/whatever.
Our nervous system extends into our bodies. We feel emotions in our bodies. People with certain kinds of brain damage that prevents them from feeling emotions normally also experience trouble making rational decisions.
More recent research has been hinting that we may even hold certain types of memories outside our brains.
Humans have always been drawn to neat, tidy ideas, especially ones that draw clean boundaries: it's an appealing idea that our consciousness lives solely in our brains, and that our brains could function independently of our bodies, but it seems unlikely that it's really that simple.
As a neuroscientist working on brain-computer interfaces, it's painfully clear to me that we are absolutely nowhere close to understanding the full complexity of the human brain in a manner required to simulate or reboot someone's consciousness. It's not even clear yet what level of abstraction is required. Do we need to map all of the synapses to get a connection graph, or do we need to map all synapses plus the synaptic proteins to assign connection weights too? This is ignoring other types of connections like gap junctions between cells, ephaptic coupling (the influence of local electric fields on neurons firing), mapping neuromodulator release, etc. On one hand, it feels like irreducible complexity. On the other hand, however, you can lose about half of your neurons to neurodegenerative diseases before you start noticing a behavioral effect, so clearly not every single detail is required to simulate your consciousness. It would be a MAJOR leap forward in neuroscience to even understand what level of abstraction is necessary and which biological details are essential vs. which can be summarized succinctly.
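For a sense of scale on the "map all of the synapses" question, here is a rough back-of-envelope using a commonly cited order-of-magnitude figure (~10^14 synapses); the bytes-per-synapse encodings are invented purely for illustration:

```python
synapses = 1e14                  # ~100 trillion synapses, a commonly cited estimate

graph_only = synapses * 8        # hypothetical: 8 bytes/synapse (source, target, weight)
with_proteins = synapses * 1024  # hypothetical: 1 KB/synapse of molecular state

print(f"bare connection graph: {graph_only / 1e15:.1f} PB")    # ~0.8 PB
print(f"annotated synapses:    {with_proteins / 1e15:.0f} PB") # ~102 PB
```

Neither number captures gap junctions, ephaptic coupling, or neuromodulator dynamics; the point is only that the choice of abstraction level swings the problem by orders of magnitude.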
Anyone claiming to take your brain and slice it up and have a working model right now is currently selling snake oil. It's not impossible, but neuroscience has to progress a ways before this is a reasonable proposition. The alternative is to take the brain and preserve it, but even a frozen or perfused brain may have degraded in ways that would make it hard to recover important aspects that we don't yet understand.
It is, however, fascinating to do the research required to answer these questions, and that should be funded and continue, even if just to understand the underlying biology.
In addition to all that we don't know about synapses etc., I've often wondered whether mapping all the "hardware connections", so to speak, would even be enough. You'd have everything in the right place, but what about the "signals" running on it? Does a certain amount of constant activity on these circuits constitute the sign of a "living" brain vs. a dead one? How much of our consciousness is really in the topology of the circuits, and how much of it is simply defined by the constant activity running around in them? I assume neural circuits form loops consisting of synapses that reinforce or suppress activity. If the signals going around and around ever "stop", can they ever be restarted with the same "patterns"? What if these patterns, the living "software", are at least partially what define you?
Well, anyway, that's my armchair crackpot neuroscience theory for the world to consume ;). I'm sure there must already be a name for the idea, though.
On one hand, I wonder if a gradual transition would work. Spend enough time over the years mirroring your conscious patterns onto a computational substrate, and they might get used to the lay of the land, the loss of old senses and the appearance of new ones. There might not be an ultimate "stepping in", but something like you might be able to outlive you, on a substrate that it feels happy and comfortable on.
On the other hand, the idea of "simulating your consciousness" raises questions beyond just cognition or personality. A mechanistically perfect simulation of your brain might not be conscious at all. Spooky stuff.
Why would you want to go on in a world that has either left you behind or keeps making the same mistakes over and over in a cycle and won't listen to you because you're too old to understand?
And conversely, I think Kim Stanley Robinson puts it best in the Mars trilogy. Scientific progress often has to wait for the old guard to die so new ideas can be tried. Sometimes there are actually new things and they need to be allowed to cook.
German physicist Max Planck somewhat cynically declared that science advances one funeral at a time. Planck noted that "a new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents die, and a new generation grows up that is familiar with it."
A scientist like Einstein experienced scientific revolutions within his lifetime. That's hardly going to be the norm in the history of science, and it's also a horrible assumption to think revolutions would endlessly occur and recur.
Also, we know when we're on the edge of knowledge, especially in cosmology and physics. We're waiting for a revolution there. There's dark energy and dark matter. It doesn't matter if you're old or young; you know that your theories aren't good enough to explain whatever these are.
Scientific knowledge doesn't get swept away, especially if it's rock solid. Newtonian physics still has a lot of relevance, after all. It's just that relativity is even more accurate.
Just imagine someone who died 50 years ago coming back and hearing skibidi toilet, no cap, ohio, etc. Then not being allowed to board a plane without a body scan, and not having money for a plane anyway, since bread was a dime and a gallon of gas was a quarter last you checked. You can't even get a job; you're just a brain, and all the knowledge work you could do is 50 years out of date.
There’s a short story about uploaded consciousnesses being used as AI slaves. They go bad once enough years have gone by that they can’t speak the modern language anymore. Then they usually lapse into insanity or depression.
>Our nervous system extends into our bodies. We feel emotions in our bodies. People with certain kinds of brain damage that prevents them from feeling emotions normally also experience trouble making rational decisions.
I think that may be true enough, but it doesn't have the upshot you seem to think it does.
It just means that we need to sustain not just the brain itself but the totality of the environmental conditions on which it depends. No easy task for sure, but not something that presents an in-principle impossibility of preserving brains.
I think there's a major philosophical error here in thinking that the added logistics present this kind of in-principle impossibility.
Also, talking like this starts to play with anti-science speculation a bit. Octopuses actually have neurons extending through their limbs. We don't. So when we talk about consciousness being "embodied", I'm sorry, it's an attempt to romanticize the question in a way that loses sight of our scientific understanding. Consciousness happens in the brain.
Sure, the brain needs stimulus from its embodied nervous system, and may even depend on those data and interactions in significant ways, but everything we know about consciousness suggests it's in the brain. And so the data from "embodied" nervous systems may be important, but there's no in-principle reason why it can't be accounted for in the context of preservation.
I don't think I agree with you. There are multiple examples in society of damaged nervous-system connections with the brain, spinal cord damage for example, where the personality of the patient changes little. In the same sense, losing limbs or entire sections of your body (aside from psychological trauma and other psychological consequences) doesn't affect personality that much.
Of course the nervous system is much more complex, but damage to the brain almost always results in some sort of cognitive dysfunction or personality change; see the Phineas Gage case, for example.
> In the same sense, losing limbs or entire sections of your body (aside from psychological trauma and other psychological consequences) doesn't affect personality that much
"There aren't any changes except for all of the changes, but those changes don't count because reasons."
I don't know how many amputees you know; you may know many. I was in the army for 10 years during the height of the global war on terror and know more than most. Not a single one is the same as they were pre-amputation. Could be the trauma that caused the amputation, could be the amputation. I'm not an amputationologist.
I do assert that a holo-techno-brain will need a shit-ton of e-drugs to deal with being amputated from its fucking body.
The bacteria in your butthole are a part of you just like your brain, maybe less, but they ARE a part of you.
I see what you mean, but consider that the gut does seem to play a significant role in mood and mental health. The enteric nervous system may not hold memories, but it seems to have something to do with personality, and digestion issues can have negative cognitive effects.
The way various hormones influence the brain alone makes it pretty clear to me already that you'd be a completely different person when taken out of your body, and I'm pretty sure that's just the tip of the iceberg.
I consider that I have likely died more than twice in my lifetime already. And before this body gives up, I will have already died more times. Must simply enjoy the present and give gifts to my future self.
There is a bit of research and effort into head transplants. If and when that is successful, I wonder how it would impact the individual: possibly having memories of the body, or a changed personality.
Transferring our consciousness into "the net", or some other fuzzy concepts are so far removed from reality as to be complete fiction. This includes freezing our brains and reanimating them later to resuscitate our lives.
They not only massively overestimate the functionality of today's tech to receive something like our consciousnesses, but even more so, by orders of magnitude, underestimate just how complex our living bodies are.
We only have the vaguest of ideas about how our physiology works (while we might be able to replicate flesh cells for "fake meat", we have 0 understanding or control over how those cells organize to form macroscopic organs). Applying this to the brain, our understanding is even more primitive. An example would be recent news that perhaps the brain is not sterile, but hosts a microbiome. Whether or not the brain hosts a microbiome is still "controversial".
We're still hundreds of years away from a comprehensive understanding of physiology.
But of course, we're never going to live that long, because we still believe (statistically as a species) in invisible guys in outer space that tell us we need to dismember people who believe in the WRONG invisible guy in outer space.
Our primitive, violent ape species will drive itself extinct long before we ever have a comprehensive grasp of how life works, especially to the level of understanding consciousness...
I'm also skeptical of the idea that one can "upload" consciousness and it would still be "you". I suppose this is true in a philosophical sense, but in a practical sense, subjective experience of consciousness rules the roost. It's inevitably going to be a mere copy of you. You don't get to experience any of it. Similar to a software project which is forked, I think it makes more sense to classify it as an entirely different entity at that point.
I suppose there are valid use cases for this, but I'm not that narcissistic to think the world needs eternal copies of me.
The continued subjective experience of the original consciousness is where I believe the real value lies. Digitisation of consciousness, assuming it has any sound scientific basis in the first place, would practically need to look more like the gradual replacement of brain (and bodily) matter with something more durable, enduring, and controllable. A slow process in which carbon is exchanged for silicon, or cellular damage is continuously reversed and aging kept at bay.
> It's inevitably going to be a mere copy of you. You don't get to experience any of it.
You can make the same argument for 'you before you went to sleep' and 'you after you woke up'. The only real link you have to that previous consciousness are memories of experiences, which are all produced by your current body/brain.
Think about this: For every consciousness (including you right now) it is _impossible_ to experience anything other than what the thing producing that consciousness produces (memories, sensations, etc.). It doesn't matter whether the different conscious entities or whatever produces them are separated by time or space. They _will_ be produced, and they _will_ experience exactly what the thing that produces them produces.
With an analogy: If you drop pebbles in either the same pond at different times or in different ponds at the same time, waves will be produced in all cases. From the perspectives of the waves themselves, what they interact with is always _exactly_ the stuff that interacts with the water they're made up of. To them, the question of identity or continuity is fully irrelevant. They're just them.
Similarly, it makes no difference whether you only have the memories of the previous conscious experiences, or if 'you' really experienced them. Those situations are indistinguishable to you. The link to future consciousnesses inhabiting your body is effectively the same.
Very well, you think that preserving the brain, or even preserving the nervous system, is futile. But what of total biostasis, preserving the entire organism, just like the archaebacteria that live for thousands of years in ice or other extreme environments by slowing their metabolisms to a crawl?
To me, excessive negativity about the possibility of immortality smacks of weakness and defeatism. You either love life and want as much of it as possible, which makes you a friend of humanity, or prefer death, which makes you an enemy of humanity. I take a stronger line than the neuroscientist in the article. “Death positivity” like that of Viktor Frankl, anti-natalism, even faith in magical spiritual resurrections—all are anti-human viewpoints, only excusable in the past because they were copes with the inevitability of death. Now that we have reason to believe it can be averted, we owe our potential future selves every possible effort to save them from oblivion.
I’m not sure I actually believe in quantum immortality but I think it is slightly suspicious—out of all the people you could have been be born as, you just happen to be born in a timeframe where brain preservation might be possible before you die?
Most people are alive right now. The population historically has been much lower, so odds are you would be born around the time high technology would support a high population.
> More recent research has been hinting that we may even hold certain types of memories outside our brains.
Not just hinting - the evidence is strong and accumulating rapidly. The gut, in particular, has so many neurons that it is considered the body’s “second brain”, to say nothing about the impact that gut bacteria have on your mind.
If you really wanted to create a copy of your “mind”, you’d have to image every neuron in your body for a thoroughly accurate copy. And then accept the fact that your entire behavioural profile is then missing the input of your gut bacteria, which appears to have a significant and non-trivial impact.
In terms of computing (a kind of computing we do not understand), it would be like cloning a live machine by taking only the CPU die, or maybe the hard drive. How many parts can you take away from a computer for it to still be the same machine? It's easy, though, with a VM, or a kernel that supports many kinds of hardware. Kind of a digression, but I liked this idea.
I don't think this is a great analogy because computers don't have consciousness (yet).
But I usually move the hard drive (or at least its contents) between machines when I get a new computer, and that's enough for me to think of it as the "same", even if I reinstall the OS on the new machine and just copy my home directory onto the new one.
If it was preserving my original brain it would definitely still be me at the core. Would everything be exactly the same? Probably not but that paradigm is more than good enough.
Ultimately, I am going to quote one of my favorite writers [0] and say that I am not afraid of a life that ends.
I don't want to be a brain in a jar. Or in a computer either. I enjoy experiencing physical sensations and interacting with the world in meatspace. And if I can't enjoy either, then just let me die.
And I apply this to not just brain preservation, but any attempt to artificially prolong the quantity of my life at the expense of the quality of my life. I do not want to spend my last years in a hospital bed hooked up to machines and unable to move. That was how my dad died, and even then he was lucky enough his partner (who he had discussed this with before and who had the authority to make the decision) eventually agreed to switch him to palliative care in his final hours. Similarly, I have seen what chemotherapy does to people, and I have long since decided that if I ever get cancer, I will refuse chemo and let myself die. I am also having a living will drawn up that includes a DNR order, multiple scenarios where doctors will be ordered to pull the plug, and a prohibition against anyone amputating any of my limbs or sensory organs even if it's necessary to save my life.
I will make sure I die with my autonomy and my dignity intact.
[0] Al Ewing. He writes comics. Read his stuff, he's good.
Do you have a source for this quote? Googling just returns this page.
I was particularly struck by:
> if I ever get cancer, I will refuse chemo and let myself die
And figured this quote must be from at least 20 or 30 years ago? Cancer isn't necessarily a death sentence, and many treatments are much less harsh than they were 20+ years ago.
> (…) and a prohibition against anyone amputating any of my limbs or sensory organs even if it's necessary to save my life.
> I will make sure I die with my autonomy and my dignity intact.
Amputees have autonomy, dignity, and rich lives. To believe that the loss of a limb is so severe that death is preferable is absurd and insensitive.
What if instead of requiring an amputation, he loses faculties by accident like suffering from parosmia due to COVID or having a weight crush a body part? Did he suddenly lose his dignity? He certainly lost some autonomy. What’s the next step then?
Many people end their life when they find it's too painful to live. Many more wish they could -- the debate around end-of-life issues is raging in many countries.
> To believe that the loss of a limb is so severe that death is preferable is absurd and insensitive.
No. Denigrating someone expressing their personal opinion seems absurd. Since the commenter did not impose their opinions on other people you had to put those words in their mouth to call them insensitive.
I prefer to die with autonomy and dignity as well, meaning I would like to pull my own plug. That other people might have a different threshold, or want to die differently than I might, seems neither absurd nor insensitive. The commenter just described their threshold, they didn't judge other people.
I suspect part of extending human life much beyond 120 years is going to be finding ways to delay physical adulthood, so that proportionally you still have the same time to learn and grow, and those growth hormones are still kicking around repairing things for longer. Because the quality of life 100 years after your organs have stopped repairing themselves is not going to be that great, but if you could reduce that to 80-90 years then maybe.
This seems a bit extreme. Chemotherapy and its effects can be a very temporary thing, and your quality of life can go back to normal after you've finished your course and the cancer has gone into remission. Certainly there are aggressive cancers where you'd be fighting a painful battle of attrition, but there are many cancers where prognoses are good, and quality of life once treatment is done is more or less the same as before. A blanket personal ban on chemo is reckless and shortsighted.
The prohibition against amputation and sensory organ removal is a bit nuts too. You'd rather die than have someone remove one of your eyes or ears, or say a hand or arm or foot or leg? That is profoundly sad, and intensely insulting to anyone who has had to deal with that sort of thing and has nonetheless lived a full, rich life.
I get that many medical interventions do actually have a terrible, permanent effect on quality of life, but these seem like pretty extreme views that ignore reality.
I don't know what the commenter who posted about chemo and amputation actually thinks or believes. But I hesitate to call them "nuts" or to lecture them about how they have a wrong opinion. And I would not expand their personal opinion as a judgment on people who decide they can live with the effects of chemo, or amputation, or loss of an eye, because nothing in the original comment included a judgment on other people. Everyone has their own threshold for what they consider a life worth continuing, but we should not impose our own thresholds on other people, or judge them for making different choices.
For me the question goes beyond "Can I survive chemo (or amputation) and resume something like a normal life?" When you have to face cancer or loss of a limb or any illness or injury that threatens your life, or perceived quality of life, or dignity and autonomy, you necessarily have to think about what that means for your future. Until you get a diagnosis of (for example) cancer you don't know what it feels like, or how you will react, to the fact that no matter if you survive the treatment or not, you will always have that threat and reminder of your mortality in your conscious thoughts. You think about how you might not get so lucky the next time, how much your treatments might cost, what your illness might put your loved ones through, how far you will go to keep yourself alive even when it imposes costs and obligations on other people. And you think that maybe other people will have to make hard decisions about your future if you can't. A cancer diagnosis doesn't just affect me, in other words. If I lost a leg or arm that would impose burdens on my wife and family, affect my ability to make a living. Those thoughts more than the medical condition itself lead people to arrive at opinions such as the original commenter expressed.
Having faced my own mortality already I know I think more about how my own end of life scenarios affect other people more than how they will affect me. I worry that I will suffer a stroke, or slip into dementia, before I can pull my own plug, leaving people I care deeply about with that awful obligation, and the burden of caring for me rather than living their own life. And it's that thought, not the fear of disease or dying, that leads me to my own ideas about how much I might endure, because I won't endure it alone or without cost to others.
Cancer is no longer a definite death sentence, and chemotherapy can make it go away for good, depending on what kind of cancer it is. I too would refuse chemo after chemo in a very aggressive form of cancer, though.
One of Don DeLillo's later good novels is about this stuff (Zero K).
I always think people's attitude toward possible future worlds is interesting. You can see a wide spread of opinion in this thread -- whether you think functional immortality would be a good thing says a lot about who you are. Ditto for colonizing other planets, automating all work, building AGI, and so on.
I suppose I'm on the side of the technologists. I think immortality is probably possible and humans should try to achieve it. But along the way it will mostly be snake oil and cults. And, of course, it's all but guaranteed that everyone in this thread isn't going to make the cut-off.
I'm certain immortality is possible, and it's also likely to be achieved, because we always do everything we can do, regardless of consequences.
But I think this is the acme of selfishness. I don't want to be immortal, and I wouldn't want to live in a world with 500-year-old know-it-alls running around "oldsplaining" everything to everyone else.
I have, thankfully, a fairly good chance of dying before that happens.
How is immortality selfish? Selfishness requires taking from other “selves” who have unmet needs of their own. But there’s every reason to believe a society of immortals could either function perfectly well without producing new selves, or that it could choose to reproduce at a slow rate sustainable with its ability to extract resources to support itself. Any new selves that were born would be provided the same opportunities that we provide new selves in the present day—breastfeeding, education, healthcare. How would that be “selfish?”
Is it selfish when a centenarian lives past 100? Is each additional year of life obtained by a centenarian “selfishly” stolen from some hypothetical unborn self?
Life expectancy has been increasing over time, especially in the past century or so. I don't think it's credible to suggest that civilization has progressed meaningfully more slowly now that people live to be 80 or so instead of only 30, which was common in recent history.
And even if immortality "stalls" humanity, so what? People matter, not technology or some amorphous concept of "progress".
1. It starts with the humans being deathless, and then death showing up. This is not how things are, all life we know of since the very beginning dies. It may well be a "feature" of life in that death is necessary, or at least useful, for life to adapt and grow - unlike the dragon, which showed up after life was already doing fine.
2. It very conveniently and explicitly frames death as an evil black dragon, anthropomorphizing it and pulling it out of the mechanical process that it is into a morally evil villain.
3. The difficulty and novelty of the discoveries we must make and engineering problems we must solve to beat death are several magnitudes smaller in the fable than in reality.
It seemed very pandering, and sets up death (and those who propose accepting its inevitability) as a straw man and those who want to beat it as a steel man. I would have been more sympathetic if the black dragon were some illness like cancer, but it seems to be an allegory for death itself. I'm not against research to improve health and slow aging, but I also think that there are much more important and realistic problems to solve than defeating aging.
You’re being defeatist and ignoring the evidence presented in the article—even hospice patients want to live longer. You, too, will desire to live before (and hopefully: if!) you breathe your last.
This is because the entire goal of the sentient consciousness is simply to preserve itself as long as possible. DNA has the essential goal of replicating itself in reproduction. Consciousness, by contrast, appears to have no goal other than self-preservation. People sometimes choose to sacrifice themselves, but usually only when death is inevitable and they wish to save someone else from it (Lily/Harry Potter and Medal of Honor type situations).
I'm not really being defeatist nor ignoring evidence. Perhaps I just have a different perspective. There can be moral/ethical arguments for why mortality is a good, or at least useful, thing.
Why 100? You can also make way for the next generation by living 80, or 60, or 40 years. Yet no one would be okay with that option. Funny thing is historically speaking 40 is a lot closer to the useful human lifespan than 100. So this strong belief of yours is really driven by advancements in society and healthcare over the last few decades. Why do you think that won't change drastically another few decades from now?
"Funny thing is historically speaking 40 is a lot closer to the useful human lifespan than 100."
That isn't really true. Life expectancy was historically driven down by high infant mortality and lack of medicine. The meaningful human lifespan has been in the 70s for the majority of history. (Lifespan is different from life expectancy)
"So this strong belief of yours is really driven by advancements in society and healthcare over the last few decades."
Who says it's a strong belief?
"Why do you think that won't change drastically another few decades from now?"
Because there's no real evidence to support that. Life expectancy hasn't gone up drastically over the past 50 years. Rates of chronic illnesses, including things like dementia, have gone up drastically. So even if people are living a couple years longer, they're generally sicker and it's costing more. Even if medicine makes drastic improvements, 100 is still a lofty goal. I'd be fine making it only 80 too. I'm actually skeptical that will even happen. What I do know is that I don't want it to take more than 100 years.
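To make the lifespan-vs-life-expectancy distinction above concrete, here's a toy back-of-the-envelope calculation; the infant-mortality rate and lifespans are illustrative assumptions, not historical data:

    # Toy model: life expectancy at birth vs. typical adult lifespan.
    # All numbers are illustrative assumptions, not historical data.
    infant_mortality = 0.30  # fraction dying around the first year (assumed)
    age_at_infant_death = 1  # approximate age at death for that group
    adult_lifespan = 70      # typical lifespan of those surviving childhood

    life_expectancy_at_birth = (
        infant_mortality * age_at_infant_death
        + (1 - infant_mortality) * adult_lifespan
    )
    print(life_expectancy_at_birth)  # prints ~49.3

High infant mortality drags the average down to roughly 49 even though no adult in this toy model dies before 70: that gap is the lifespan-vs-expectancy point in a nutshell.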
That's fine, but please don't stand in the way of those of us who would love to experience the world on a longer time frame, and are frustrated that the current level of medical knowledge doesn't allow it.
Yeah, I find the need to live forever kind of.. juvenile? You can’t let go of your ego for long enough to realise that at some point it’s better to make room for a new human with new perspectives and new ideas?
I like to think of it this way: if life was a game would you want to play the same character forever? No.. if you’re gonna keep playing the game it’s more interesting to start from scratch now and then. I don’t believe in reincarnation. There’s no need to. What you really are deep down is an instance of humanity. Almost all your genes and all your culture comes from and is shared with other humans. Any new instance (new human) is you playing a new character, essentially. If you’ve contributed to shaping the world you’re leaving behind this is even more true.
Unless you believe in a soul in the Christian/Jewish/Muslim sense, I guess, but then why would you fear death?
IMO the pursuit of immortality is far more dangerous and far more likely to kill humanity than AI; at worst, AI may merely make us deteriorate into insignificance. Humanity is a superorganism, and we have a name for the phenomenon where part of an organism figures out how not to die and yet still replicates: cancer.
We don't need to live forever, as shown by the fact that we've gotten by without it so far, but death is kind of depressing. I've never really understood the distinction whereby killing millions in the Holocaust is terrible, but similar millions dying of old age is desirable.
The best way to go about this, IMO, is nanobots: first augmenting the brain and connecting it to the cloud, then slowly replacing the brain and syncing consciousness while online. Replace a little every night and it would still be you, IMO, since that most closely replicates what the brain already does.
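If something like this ever became possible, the biology would be the hard part; the arithmetic of gradual replacement, at least, is simple. A minimal sketch, where the 1% nightly replacement rate is a pure assumption:

    # Sketch: how much of the original substrate remains under nightly
    # partial replacement. The 1% nightly rate is an arbitrary assumption.
    rate = 0.01
    remaining = 1.0
    nights = 0
    while remaining > 0.01:  # stop once <1% of the original brain is left
        remaining *= 1 - rate
        nights += 1
    print(nights)  # 459 nights, roughly 15 months, to drop below 1%

The point of the sketch is just that even a very gentle nightly rate replaces essentially the whole substrate within a couple of years, while never presenting a single discontinuous "copy" step.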
There will never be two of me in my bed when I wake up, so I’m the same.
If my mind could be copied and embedded in a robot or whatever, then there are two of me, each now diverging. That other one is not me, as far as I, the other, am concerned.
Otherwise I actually agree with your point ("every night another little death", as the Sonic Youth song says): I'm different every morning. Usually, the differences are subtle. Over the time span of a lifetime though, I've been through enough changes that I often say ghosts of former selves share this body, and what "I" am now will probably be another ghost by the time the biological substrate dies and takes whatever "I" will be then, along with all the ghosts, with it (in my case we'll go up in smoke since I'll be cremated).
This is provable - only one instance can wake up, whereas with the other method, it’s conceivable that two or more copies (instances?) could wake up.
On the other hand, I do seem to have a vast store of dream memories; when I think about them it seems like living some kind of second life, usually while fast asleep. Sleeping and waking selves are in agreement that being asleep is generally more pleasant than being awake, though being awake seems to have better continuity.
So, when you state that robo brain is still "you", I have to disagree. While robo-brain can still believe it's "you", it can't possibly be "you", as "you" could potentially still exist in the original biomechanical device, which isn't interpreting the new body's stimuli, and thus not building a sensory profile based on the stimulus. Even if there was a transfer that resulted in the original device's (i.e. your body) death, the new body still couldn't be considered "you" as there isn't anything physically different about the two scenarios for the body.
Of course, if you consider every clone to be an extension of your consciousness, then you could argue all this. But at that point I think we'd have vastly different interpretations on what consciousness and "you" are. You could also argue that there is something physically different about the two scenarios presented in the robo-brain example, but the only way I can imagine a sincere argument for that is if you told me it was due to the transference of some soul-like entity. But, of course, if you were to tell me that, I would simply disagree.
What consciousness? Yes, I am aware that "sleep" means that a person becomes unconscious (trivially demonstrated), but that is not the same thing as "consciousness shutting down". Supposing there is such a thing as consciousness, it doesn't seem to be anything that is suspended by the sleep process. And that's also a big "if", because consciousness is just woo-woo nonsense, talked about but never defined, by people who haven't quite gotten over the idea that humans have some immaterial soul.
Although, I'll grant that the subjective experience may be fairly similar. If you were to theoretically be copied, your copy would probably feel quite continuous with the original.
Sure, but I'm ok with sidestepping that totally pedantic concern and still considering myself the same person, like I do every single day.
Mere point-in-time-copying to another substrate, though, not so much.
You’re not unconscious while you’re sleeping.
Scientifically, though: of course an exact copy of you is you. If you don't believe in souls but still feel like this is not the case, that indicates your model of 'you' is incorrect.
Even if you somehow made a perfect instant copy, they'll start drifting apart, as they'll be experiencing different things.
Your replica however would have a different experience and be able to draw on your memories etc. But you yourself will in fact live a natural life with no shared experience with your replica and will experience death no matter what. It’ll be little solace to you that a copy got to live your life after your passing.
How is that "scientifically" an "of course"?
How is it more "an exact copy of you is you" than the alternative claim, "an exact copy of you is 'you (1)'" (to borrow from file manager nomenclature).
The trivial example of that seems to be that if you make an exact copy, put it in a different room, and then something happens to the exact copy, that thing does not happen simultaneously to the original copy.
And I'm not sure that's the only issue even for die-hard materialists: think for instance about all the problems that come from multiple exact copies & things like ownership, relationships...
It is enough to understand that whatever has an independent "command & control" structure (for lack of a better term; C&C) is by definition 'you'. C&C means that the individual can perform a self-contained sense-think-act cycle.
We can infer this from the following:
1. Identical twins, despite having identical genetic structures, have separate C&Cs, and therefore qualify as separate persons.
2. Siamese twins conjoined by the body (including parts of the head) have two C&Cs, and therefore qualify as two persons (with varying levels of bodily control per person).
3. An individual with additional body parts (e.g., in the special case of being conjoined with a twin) has only one C&C and therefore qualifies as one person.
4. Clones are a special case of identical twins, and still have separate C&Cs, therefore qualifying as two persons.
5. A person in a coma or with another such condition nonetheless qualifies as a person with some sense-think-act autonomy, despite being dependent on someone else for continued bodily functions.
6. An individual with a transplant (e.g., heart) or an extension (e.g., pacemaker) is nonetheless a person because of their consistent C&C.
7. An individual with extreme memory loss remains a person with C&C (in brain, body, and genetics for the most part).
Any other special but naturally occurring cases (e.g., individuals with two brains, individuals with completely separated hemispheres, conjoined twins sharing a brain) would require either that we:
a. understand how brains and bodies work (and therefore make a more quantified measure of how much sense-think-act autonomy qualifies as personhood); or
b. decide based on a heuristic (1 brain = 1 person) or a principle (protection of bodily autonomy, i.e., 1 body = 1 person).
But none of these require you to believe in a soul in order to think that a digital clone is not 'you'. Unless, of course, you can prove that an individual either:
1. can sense-think-act with both physical and digital 'bodies' simultaneously and/or sequentially*, which share an equivalent experience (i.e., a central C&C coordinating distribution or maintaining strict consistency across bodies);
2. can sense-think-act with either physical or digital 'body' at will, upon establishing a 'connection' and successful transfer of state (i.e., remote access); or
3. can transfer whatever is necessary for C&C to another 'body' (e.g., a brain transplant), and demonstrate beyond reasonable doubt that the C&C signatures in the new body match those in the old one.
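Purely as a toy formalization of this C&C heuristic (every field name and criterion boundary below is my own assumption, nothing established):

    # Toy predicate for the "command & control" heuristic sketched above.
    # All field names and criteria are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class Candidate:
        can_sense: bool            # receives stimuli on its own
        can_think: bool            # processes them via its own C&C structure
        can_act: bool              # initiates actions (even if assisted)
        has_independent_cnc: bool  # loop not driven by another person's C&C

    def counts_as_separate_person(c: Candidate) -> bool:
        # One independent sense-think-act loop = one person, per the heuristic.
        return c.can_sense and c.can_think and c.can_act and c.has_independent_cnc

    # Identical twins: two independent loops, hence two persons (case 1 above).
    twin_a = Candidate(True, True, True, True)
    twin_b = Candidate(True, True, True, True)
    print(counts_as_separate_person(twin_a), counts_as_separate_person(twin_b))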
1. If we are entirely material, we can't know anything. For example, if you're thinking about a tree, there must be something about you that has 'grasped' what it is to be a tree. This is impossible if your idea of a tree is nothing more than a bunch of neurons firing in a given way. There is no correspondence between neurons firing and a tree. So there must be more to your thought than neurons firing (or any other material process).
2. Similarly, if we're entirely material, we can't reason. A reasonable thought differs from an unreasonable thought only by its content. It can't differ in this way by its material components in the brain. Therefore, if we hold that some thoughts are wrong and others are right, we must conclude that there is more to our thought than the material.
Yeah, that's what my clones keep telling me. I ask them: have you ever woken up inside a cloning machine? That probably means you're not the original. Then they somehow sedated me and locked me inside the cloning machine.
All subatomic particles, quantum woo-woo including all their entanglements?
The weakest electromagnetic fields as an emergent pattern resulting from these?
That's just one opinion, though. We still don't know what consciousness really is. We don't know what makes us who we are. We don't know if some concept (scientific or supernatural) of a "soul" exists apart from the behavior that arises from the physical structures of the brain.
We just don't know. So maybe you're right, but that's far from clear.
There would be thousands of copies that experienced going to bed and then nothing afterwards.
Looking forward from now, both bio-me and digital-fork-me (or indeed bio-fork-me, if the upload gets re-embodied) are equally "my future self", even though they would also each be distinct beings in their own right and not a shared consciousness over many bodies.
I find it fascinatingly coincidental that neurons are the only cells in the body that don't rejuvenate unless there is some sort of injury [0].
[0]: https://www.dzne.de/en/im-fokus/meldungen/2021/neurons-able-...
https://en.wikipedia.org/wiki/Ship_of_Theseus
Technically, we are not the same people that we were when we were born. The cells in our body have one by one been replaced with new ones and the old ones have died off. The things that make up my body are 100% different than when I was born, so I literally am not the same person, physically.
Maybe this is an indicator that there is more to what makes us, us, than just the physical assembly of atoms in our bodies. There are things I don't know that we'll ever get a full understanding of.
The above scenario is if you get re-implanted into a self-evolving, autonomous biological entity, unlinked again from the system. If this is not feasible and the only solution is to embed you into a robot with an uplink to the internet, because "why not?", then my biggest issue with a digital self is that there are no guarantees of having a proper firewall, which would amount to total surveillance:
https://en.wikipedia.org/wiki/Die_Gedanken_sind_frei
Easy to prove/disprove once you can actually copy a human being, as he should then report being simultaneously aware of both bodies.
Other than that, it's possible that by the same logic that you are not the same as your copy, you're not the same after you sleep: the continuity of your consciousness is interrupted, and the new you in the morning thinks he is the old guy, having all the memories, but he actually isn't. Think about this: what if you were replaced with a clone while you sleep? No one would ever know, and scientifically speaking both scenarios are equivalent because they lead to the same result. Maybe enjoy today a little bit longer, now knowing you're actually going to die in that bed; and as your "clone" will wake up in the morning completely oblivious of not actually continuing your consciousness, you don't even get a witness to your tragic death, as was the case with thousands of your predecessors.
Or maybe it's even worse, because the consciousness is interrupted tens or hundreds of times during the day. Or maybe those are thousands of separate, overlapping consciousness bubbles and really you should worry only about your identity, as the consciousness is just short-lived fireworks on top of the actual you.
I guess it would be nice to know what consciousness even is.
This reads a bit like "a copy of Super Mario Bros isn't Super Mario Bros, it's a copy".
It has all of the bits that let you distinguish Super Mario Bros from Donkey Kong or Super Mario World. Why isn't it also Super Mario Bros?
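In programming terms, the disagreement is roughly equality versus identity. A minimal illustration (the game title is just stand-in data):

    # Two byte-identical copies: equal in content, distinct as instances.
    original = bytearray(b"SUPER MARIO BROS")
    copy = bytearray(original)

    print(copy == original)  # True: same bits, indistinguishable by content
    print(copy is original)  # False: two separate objects in memory

    copy[0:5] = b"WARIO"     # from here on, their histories diverge
    print(copy == original)  # False

Whether "you" names the content (equality) or the instance (identity) is exactly what the two sides in this thread disagree about.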
Even 5 minutes from now, you don't know where you'll be, and you will be a completely different person based on the events of that 5 minutes.
Every moment that passes, the you of the moment dies.
What you can have, is a you in the future that remembers and understands the past you. That only happens if you preserve your memories either by not dying, or by preserving your internal memories in some other way.
That's all you get. A copy in the future that remembers, or nothing at all.
Thinking about this even a little makes me want to throw up, because this can never be tested. Yeah, I'd be dead, but technically, I'm alive. I'm alive according to everyone, including myself (new me). So I'm not dead. But like you said, I died and am dead and don't experience any more life. If all evidence is that I didn't die and still exist, then that's the fact.
We can theorize that our consciousness continues, but there is NO WAY to ever actually test it, because all experience is that consciousness continues whether or not that is the case. Sprinkle in some quantum suicide and my stomach is doing backflips from the existentialism OD.
What if: at the point of copying there are two of you, both of whom don't want to die?
It's all just hypothetical without knowing what consciousness/qualia really "is," and as far as I can discern we've basically moved no closer to understanding that since the very beginnings of neurological research.
Are electrical processes rebooted from chemical memory every morning when you wake up or after you have a seizure still the same you? Or is it just a fresh copy of your mind that dispersed when electrical signals in your brain lost continuity?
OK, now your consciousness is 'transferred'. I promise to run you.exe plenty. It's just an exercise in self-delusion, even if this were possible. That's not you any more than a 'you' created by a taxidermist and shoved in a closet is.
It used to be that what made you alive was electricity; you could Frankenstein shock yourself back to life.
It used to be that you were a clock. Your gears wound up at birth, and then eventually you wore out. You needed repairs.
People love to use metaphors, but none of these things are the full picture. Just because computers are so complex doesn't make you more correct. Your brain isn't the whole of your mind; we already know that's true. Why is this silly nonsense entertained?
What happens if you have an identical copy down to the atom. Totally impossible to distinguish. You're put to sleep and your mind is copied over. When you both wake, which one is "you"? Both copies think they're "you" and both are correct. Each has the same sense of self and continuity of identity. Maybe at the moment of synchronization "you" ceased to exist as you branched into two new identities.
Say you upload into a computer. From the copy's perspective, it is "you", it's the same self. It might view your biological body like you'd view an old computer after you finish migrating your files to a new one.
Say you destroy the biological body, or wipe its brain once you copy. Does that change the equation? If you destroyed one of the identical clones, is it even relevant to ask which is "you"?
Personally, I think Altered Carbon comes close to how our society will solve this problem. The original copy has its brain wiped and the new copy is "you" by cultural consensus. Truly duplicating a single identity is strongly taboo and illegal.
I think this is a question that either matters to you or it doesn't. In my opinion, it's irrelevant. I, the original "me", am totally free to agree with the copy that it is "me". I can choose to give it my social and legal identity and agree that "me" is no longer "I". My personal choice is to destroy the original, but one could also choose to let the original body continue and assume a new identity, live a new life, or whatever.
I view this the same way I do my past self. The person I was ten years ago is not "me", it was a totally different person. That past self lived a different life and had a sense of identity that isn't at all like what I have today. That past me, the ego that called itself "me" died a long time ago and the "me" now is a different self identity built out of new pieces. In my worldview, "me" is a transient thing. The self is not one fixed thing, it changes over time and sometimes it's replaced. I don't see the idea of transferring my sense of self to a new body as anything more extreme than any other kind of ego death.
I choose to resolve this problem with practicality. I agree with myself that the new copy is "me". My social and legal identity, as well as my own sense of self transfer to the copy. My choice would be to destroy the original copy. Barring that, it would assume a new identity and live a different life far away. It'd get a memory wipe if available. I can make the choice to sacrifice my ego and allow "me" to be assumed by someone else. To me, even to the original copy, the new copy is me. In my opinion, "me" is immortal in the digital copy, even if "I" remain in a mortal body.
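That cultural-consensus rule reads almost like a protocol. A sketch of the policy described above, with every step obviously hypothetical:

    # Hypothetical copy-verify-retire policy, per the comment above: exactly
    # one bearer of the identity survives the transfer.
    def transfer_identity(original, make_copy, verify, retire):
        replica = make_copy(original)      # point-in-time duplicate
        if not verify(original, replica):  # e.g., state comparison/checksum
            raise RuntimeError("copy failed; original keeps the identity")
        retire(original)                   # wipe the original, per the taboo
        return replica                     # sole legal/social "me" from now on

    me = {"name": "me", "memories": ["childhood", "yesterday"]}
    new_me = transfer_identity(
        me,
        make_copy=lambda o: {"name": o["name"], "memories": list(o["memories"])},
        verify=lambda o, r: o == r,
        retire=lambda o: o.clear(),
    )
    print(new_me)  # the copy carries on; `me` is now empty

Note the ordering: verification happens strictly before retirement, so a failed copy never destroys the original, which matches the commenter's "barring that, the original assumes a new identity" fallback.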
Our nervous system extends into our bodies. We feel emotions in our bodies. People with certain kinds of brain damage that prevents them from feeling emotions normally also experience trouble making rational decisions.
More recent research has been hinting that we may even hold certain types of memories outside our brains.
Humans have always been drawn to neat, tidy ideas, especially ones that draw clean boundaries: it's an appealing idea that our consciousness lives solely in our brains, and that our brains could function independently of our bodies, but it seems unlikely that it's really that simple.
Anyone claiming to take your brain and slice it up and have a working model right now is currently selling snake oil. It's not impossible, but neuroscience has to progress a ways before this is a reasonable proposition. The alternative is to take the brain and preserve it, but even a frozen or perfused brain may have degraded in ways that would make it hard to recover important aspects that we don't yet understand.
It is, however, fascinating to do the research required to answer these questions, and that should be funded and continue, even if just to understand the underlying biology.
Well, anyway, that's my armchair crackpot neuroscience theory for the world to consume ;). I'm sure there must already be a name for the idea though.
On the other hand, the idea of "simulating your consciousness" raises questions beyond just cognition or personality. A mechanistically perfect simulation of your brain might not be conscious at all. Spooky stuff.
(I'm just a programmer so it's fascinating to me to consider how actual brain scientists model consciousness in their work.)
https://www.mdpi.com/2673-3943/5/1/2
https://pubmed.ncbi.nlm.nih.gov/31739081/
And conversely, I think Kim Stanley Robinson puts it best in the Mars trilogy. Scientific progress often has to wait for the old guard to die so new ideas can be tried. Sometimes there are actually new things and they need to be allowed to cook.
German physicist Max Planck somewhat cynically declared that science advances one funeral at a time, noting that "a new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents die, and a new generation grows up that is familiar with it."
A scientist like Einstein experienced scientific revolutions within his lifetime. That's hardly going to be the norm in the history of science, and it is also a horrible assumption to think revolutions would endlessly be occurring and recurring.
Also, we know when we're on the edge of knowledge, especially in cosmology and physics. We're waiting for revolution there. There's dark energy and dark matter. It doesn't matter if you're old or young; you know that our theories aren't good enough to explain whatever these are.
Scientific knowledge doesn't get swept away, especially if it's rock solid. Newtonian physics still has a lot of relevance, after all. It's just that relativity is even more accurate.
Plus there are a lot of assholes in the world. Come on, there isn’t anybody you’d enjoy watching get Ozymandias‘d? I’d enjoy it.
I think that may be true enough, but it doesn't have the upshot you seem to think it does.
It just means that we need to sustain not just the brain itself but the totality of the environmental conditions on which it depends. No easy task for sure, but not something that presents an in-principle impossibility of preserving brains.
I think there's a major philosophical error here in thinking that the added logistics present this kind of in-principle impossibility.
Also, talking like this starts to play with anti-science speculation a bit. Octopi actually have neurons extending through their limbs. We don't. So when we talk about consciousness being "embodied", I'm sorry, it's an attempt to romanticize the question in a way that loses sight of our scientific understanding. Consciousness happens in the brain.
Sure, the brain needs stimulus from its embodied nervous system, and may even depend on those data and interactions in significant ways, but everything we know about consciousness suggests it's in the brain. And so the data from "embodied" nervous systems may be important, but there's no in-principle reason why they can't be accounted for in the context of preservation.
You don't have neurons extending through your limbs?
Of course the nervous system is much more complex, but damage to the brain almost always results in some sort of cognitive dysfunction or personality change; see the Phineas Gage case, for example.
"There aren't any changes except for all of the changes, but those changes don't count because reasons."
I don't know how many amputees you know; you may know many. I was in the army for 10 years during the height of the global war on terror and know more than most. Not a single one is the same as they were pre-amputation. Could be the trauma that caused the amputation, could be the amputation. I'm not an amputationologist.
I do assert that a holo-techno-brain will need a shit-ton of e-drugs to deal with being amputated from its fucking body.
The bacteria in your butthole are a part of you just like your brain, maybe less, but they ARE a part of you.
But I would assume that bringing someone back would be tied to a physical or simulated body that provided a compatible context.
Not a bad assumption to solidify in your brain preservation/restoration contract.
Transferring our consciousness into "the net", or some other fuzzy concept, is so far removed from reality as to be complete fiction. This includes freezing our brains and reanimating them later to resuscitate our lives.
These ideas not only massively overestimate the capacity of today's tech to receive something like our consciousness, but even more so, by orders of magnitude, underestimate just how complex our living bodies are.
We only have the vaguest of ideas about how our physiology works (while we might be able to replicate flesh cells for "fake meat", we have 0 understanding or control over how those cells organize to form macroscopic organs). Applying this to the brain, our understanding is even more primitive. An example would be recent news that perhaps the brain is not sterile, but hosts a microbiome. Whether or not the brain hosts a microbiome is still "controversial".
We're still hundreds of years away from a comprehensive understanding of physiology.
But of course, we're never going to live that long, because we still believe (statistically as a species) in invisible guys in outer space that tell us we need to dismember people who believe in the WRONG invisible guy in outer space.
Our primitive, violent ape species will drive itself extinct long before we ever have a comprehensive grasp of how life works, especially to the level of understanding consciousness...
I suppose there are valid use cases for this, but I'm not that narcissistic to think the world needs eternal copies of me.
The continued subjective experience of the original consciousness is where I believe the real value lies. Digitisation of consciousness, assuming it has any sound scientific basis in the first place, would practically need to look more like the gradual replacement of brain (and bodily) matter with something more durable, enduring, and controllable. A slow process in which carbon is exchanged for silicon, or cellular damage is continuously reversed and aging kept at bay.
There is no continuity of subjective experience even within the same brain; you can be deeply unconscious for extended periods of time and come back.
You can make the same argument for 'you before you went to sleep' and 'you after you woke up'. The only real link you have to that previous consciousness are memories of experiences, which are all produced by your current body/brain.
Think about this: For every consciousness (including you right now) it is _impossible_ to experience anything other than what the thing producing that consciousness produces (memories, sensations, etc.). It doesn't matter whether the different conscious entities or whatever produces them are separated by time or space. They _will_ be produced, and they _will_ experience exactly what the thing that produces them produces.
With an analogy: If you drop pebbles in either the same pond at different times or in different ponds at the same time, waves will be produced in all cases. From the perspectives of the waves themselves, what they interact with is always _exactly_ the stuff that interacts with the water they're made up of. To them, the question of identity or continuity is fully irrelevant. They're just them.
Similarly, it makes no difference whether you only have the memories of the previous conscious experiences, or if 'you' really experienced them. Those situations are indistinguishable to you. The link to future consciousnesses inhabiting your body is effectively the same.
To me, excessive negativity about the possibility of immortality smacks of weakness and defeatism. You either love life and want as much of it as possible, which makes you a friend of humanity, or prefer death, which makes you an enemy of humanity. I take a stronger line than the neuroscientist in the article. “Death positivity” like that of Viktor Frankl, anti-natalism, even faith in magical spiritual resurrections—all are anti-human viewpoints, only excusable in the past because they were copes with the inevitability of death. Now that we have reason to believe it can be averted, we owe our potential future selves every possible effort to save them from oblivion.
A surprisingly large fraction of all the people who have ever lived are alive right now. The population historically has been much lower, so odds are you would be born around a time when high technology could support a high population.
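For scale, using commonly cited demographic ballpark estimates (assumptions, not precise figures):

    # Ballpark demographic estimates (Population Reference Bureau figures).
    ever_born = 117e9  # humans ever born, approximate
    alive_now = 8e9    # current population, approximate

    print(alive_now / ever_born)  # ~0.068: about 7% of everyone ever is alive

Even at roughly 7%, a randomly chosen human observer is far more likely to find themselves in a high-population, high-technology era than in any earlier one, which is the anthropic point being made here.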
Not just hinting - the evidence is strong and accumulating rapidly. The gut, in particular, has so many neurons that it is considered the body's "second brain", to say nothing of the impact that gut bacteria have on your mind.
If you really wanted to create a copy of your “mind”, you’d have to image every neuron in your body for a thoroughly accurate copy. And then accept the fact that your entire behavioural profile is then missing the input of your gut bacteria, which appears to have a significant and non-trivial impact.
But I usually move the hard drive (or at least its contents) between machines when I get a new computer, and that's enough for me to think of it as the "same", even if I reinstall the OS on the new machine and just copy my home directory onto the new one.
I don't want to be a brain in a jar. Or in a computer either. I enjoy experiencing physical sensations and interacting with the world in meatspace. And if I can't enjoy either, then just let me die.
And I apply this to not just brain preservation, but any attempt to artificially prolong the quantity of my life at the expense of the quality of my life. I do not want to spend my last years in a hospital bed hooked up to machines and unable to move. That was how my dad died, and even then he was lucky enough his partner (who he had discussed this with before and who had the authority to make the decision) eventually agreed to switch him to palliative care in his final hours. Similarly, I have seen what chemotherapy does to people, and I have long since decided that if I ever get cancer, I will refuse chemo and let myself die. I am also having a living will drawn up that includes a DNR order, multiple scenarios where doctors will be ordered to pull the plug, and a prohibition against anyone amputating any of my limbs or sensory organs even if it's necessary to save my life.
I will make sure I die with my autonomy and my dignity intact.
[0] Al Ewing. He writes comics. Read his stuff, he's good.
I was particularly struck by:
> if I ever get cancer, I will refuse chemo and let myself die
And figured this quote must be at least 20 or 30 years old? Cancer isn't necessarily a death sentence, and many treatments are much less harsh than they were 20+ years ago.
> I will make sure I die with my autonomy and my dignity intact.
Amputees have autonomy, dignity, and rich lives. To believe that the loss of a limb is so severe that death is preferable is absurd and insensitive.
What if instead of requiring an amputation, he loses faculties by accident like suffering from parosmia due to COVID or having a weight crush a body part? Did he suddenly lose his dignity? He certainly lost some autonomy. What’s the next step then?
No. Denigrating someone expressing their personal opinion seems absurd. Since the commenter did not impose their opinions on other people you had to put those words in their mouth to call them insensitive.
I prefer to die with autonomy and dignity as well, meaning I would like to pull my own plug. That other people might have a different threshold, or want to die differently than I might, seems neither absurd nor insensitive. The commenter just described their threshold, they didn't judge other people.
The prohibition against amputation and sensory organ removal is a bit nuts too. You'd rather die than have someone remove one of your eyes or ears, or say a hand or arm or foot or leg? That is profoundly sad, and intensely insulting to anyone who has had to deal with that sort of thing and has nonetheless lived a full, rich life.
I get that many medical interventions do actually have a terrible, permanent effect on quality of life, but these seem like pretty extreme views that ignore reality.
Who says a brain in a jar can't enjoy either of these? Who says that isn't, in fact, what you are enjoying right now?
"Do you want to live longer?"
"Yes"
"OH YOU WANT TO LIVE A MILLION BILLION YEARS?!?!"
There are values in between immortality and ~80 years.
https://qntm.org/lena
CGP Grey captures this sentiment nicely in this animated essay: https://www.youtube.com/watch?v=cZYNADOHhVY "Fable of the Dragon-Tyrant" [2018-04-24]