Yes, of course both are. But still, you're only ever going to be conscious from the perspective contained within the original you. The copy may be mathematically identical, but it does not make sense that there would be any kind of a magical transition of consciousness from one to the other. The other "you" would have its own consciousness entirely separate from yours. I.e., if you copy yourself and kill the original, you will lose consciousness. You will not wake up.
Imagine I have a magical device with which I can "stop time", or more precisely, stop the movement of all particles except mine. In the beginning I am at your right. I stop time, move leisurely to your left, and resume time.
What you see is me teleporting from your right to your left. You don't feel "trapped in a body that can't move", because your neurons are also magically frozen. Your consciousness feels continuous, even if you have been "suspended" for a while, because your memories are properly structured. Consciousness is not "something else"; it is a property of your memories. (The blog post on lesswrong.com about timeless physics is important for seeing this point of view, I think, even if you don't buy the theory.)
So if my copy is mathematically identical then there is nothing else left. There is no extra consciousness stuff that the "original" has and the "copy" does not.
Sorry, I really can't explain myself better. I'd suggest reading the link provided.
I will read what you suggest, but perhaps you can illustrate how the sensation of identity would transfer from the physical body to the digital body? Your example does not explain it.
Let's suppose we believe that everything about me can be contained in the physical processes that happen in my body.
Now suppose that we have the ability to digitally simulate those processes perfectly. Since my existence is purely physical, the "sensation of identity" you describe is perfectly captured by this digital simulation, by definition.
It could be duplicated, but then it would be a copy and not the one you currently have. I expect you can easily imagine this by picturing your copy uploaded to the simulator while you live and then running the simulation.
Whatever the copy does after that you won't experience and it won't be part of your memories. If you say that's not the case and that you will experience and be aware of what the copy is doing, then you are proposing some sort of metaphysical connection between the two beings, which I find hard to swallow.
I like this definition of identity: "A person's identity is defined as the totality of one's self-construal, in which how one construes oneself in the present expresses the continuity between how one construes oneself as one was in the past and how one construes oneself as one aspires to be in the future"
I don't know where you're getting the idea about a psychic connection. That's a strange idea, and I don't see anyone suggesting that.
What I see you doing is presupposing the "I" and "it" beforehand, as if the future clone did not share all your memories and experience. Right now the clone and you are one and the same. Just as you can imagine a copy uploaded to a simulator and run, it is equally valid for you to imagine blacking out during the brain scan and waking up in a virtual world. Before you undergo the brain scan, it would be wise to prepare yourself for possibly waking up as the clone.
My copy and I are the same only at the instant the copy is made 'alive'; a second after that we will no longer be equal, and with every second that passes we will diverge further, unless there is some kind of metaphysical connection.
The virtual world copy will of course be (just like) me the instant it wakes up, but I (the original) won't experience the virtual world. I don't see how the original could wake up in the virtual world.
There are three states of being here, and you're confusing two of them. First, there is pre-you: the you before the cloning point. After that there are the original and the clone, both of which share the exact same pre-you experience. Just because you are pre-you doesn't mean you will be the original after the cloning point, since both the original and the clone branch from pre-you. If you are the copy, your experience is that you were pre-you first and then suddenly you woke up in a virtual world. This is illustrated simply by looking at the fork() system call. You cannot call fork() and then have the code afterwards presuppose that it is running in the parent process without checking the return value. In the same way, you cannot be sure that you are the original after the cloning point unless you have solid evidence of it.
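The fork() analogy can be made concrete with a minimal Python sketch (assuming a POSIX system, since os.fork is not available on Windows). Both processes resume from the same point with identical memory and history; nothing inside either one's state marks it as "the copy". Only the return value of fork() distinguishes them:

```python
import os

pid = os.fork()  # one process calls this; two processes return from it

if pid == 0:
    # The child ("the clone"): identical memory, identical history up to
    # the fork. Nothing in its internal state says "copy" -- only the
    # return value of fork() tells it which branch it is on.
    print("child: I remember walking into the scanner...")
    os._exit(0)
else:
    # The parent ("the original"): the only difference is that fork()
    # returned the child's pid instead of 0.
    os.waitpid(pid, 0)
    print(f"parent: my copy ran as pid {pid}")
```

Code written before the fork() call is, in effect, "pre-you": it speaks for both branches, and neither branch can presuppose which one it is without checking.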
It would be an exact copy of your mind. Think about that. All of your experiences, memories, thought-patterns, etc., including all of your presuppositions that you will be the original. You sit there and smugly tell yourself that you will not be the clone. You couldn't possibly wake up as a clone, right? That the clone is going to be this "other" thing over there that has nothing to do with you. Guess what? The clone wakes up having had all of those exact same thoughts and experiences. The clone is you. I wouldn't recommend that anyone in this state of mind go through a cloning process because it would just end up with a confused, depressed and generally fucked up clone.
I fully agree that the copy will rightly believe it's the original, and for all practical purposes, to the rest of the world, he can very well be considered an original, if he can at least communicate with the outside.
But this just doesn't consider the fact that in the real world, there was a real original who went to a copying facility and then went home, in the physical world. This person does have the return value of fork() [$] and does not experience the virtual world.
It's in this sense I'm saying I can't imagine going to a copying facility and waking up in a virtual world. I can perfectly imagine a copy doing that, but it won't have my future experiences. In fact, going further, given my beliefs, were "I" to wake up in a virtual world I'd be sure I'm a copy, because I'm certain the original could not wake up in a virtual world.
[$]: As long as we don't get fancy with psychothriller manoeuvres where the original is drugged and the copy has a body clone that returns home to his wife, while everyone tells the real original he's in a virtual world.
Yes, there is a "you" who went to the facility and then went home. There is also a "you" who went to the facility and subsequently blacked out, only to wake up later somewhere else. An outside observer sees you walk in, get scanned, and walk out. From the perspective of the post-original, you walked in, were scanned, then walked out. From the perspective of the copy, you walked in, blacked out, and then woke up later.
I guess my point is just that when you refer to yourself pre-cloning, you have to realize that you're speaking (and thinking) for the copy as well. It's fun to think about. Makes for great sci-fi.
Yes, but it will create a new sensation of identity in the copy, one equal to but separate from yours. That's the gist of the point. See my other answers around this subthread.
I only object to the privileging of "your" consciousness. I don't know how to express it. Yes, of course once the copy is made, each person-thread evolves on its own. My point is that you can't distinguish one as being "the original" and the other as "the copy".
Just imagine that, in the normal course of things, a person at time t is constantly being copied into a slightly different new person at t+dt and then destroyed. There is a causal connection and thus an inheritance of memories, but no single "essence of yourself" being conserved.
With current technology we only have a directed linear graph, like o -> o -> o -> ... . With uploading tech this graph will fork into two paths, but neither of them can claim to be "the original", because there is no such thing.
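The branching picture above can be sketched as a toy data structure (a hypothetical illustration; the class and memory strings are made up for the example). Each person-state is a node linked to its predecessor, so after the fork both branches share the same past, and nothing in the structure marks either branch as "the original":

```python
class PersonState:
    """A snapshot of a person at one instant, causally linked to its predecessor."""
    def __init__(self, memories, predecessor=None):
        self.memories = memories
        self.predecessor = predecessor

    def history(self):
        # Walk the causal chain back to the earliest state.
        node, past = self, []
        while node is not None:
            past.append(node.memories)
            node = node.predecessor
        return list(reversed(past))

# The linear chain we have today: o -> o -> o
pre_you = PersonState("childhood")
pre_you = PersonState("walked into the facility", pre_you)

# The fork: two successors of the same state, neither one privileged.
stayed = PersonState("walked back out", pre_you)
uploaded = PersonState("woke up in a virtual world", pre_you)

print(stayed.history())
print(uploaded.history())
```

Both histories begin with the same shared entries and diverge only after the fork, which is exactly the "neither can claim to be the original" point: the graph records two equally valid causal paths from pre_you.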
I mean, maybe this concept of "personal identity" is simply wrong. It's an illusion that holds only because the causal threads running through people are currently linear, and that's just a technological limitation.
It's very rational for you to avoid privileging any single one of my copies, but for me it's very rational to privilege my own instance. And likewise, I wouldn't care which one of your copies thrive, but I would expect that you would hope your instance is the one that survives.
And I don't see what linearity has to do with self-preservation instincts. Maybe you are arguing that once we have several diverging copies, the self-preservation instinct will change and be content as long as one copy still survives? I just can't imagine it.
Let's say we can copy me. As in, we make another me (let's call him me* ) that is physically identical to me in every possible way.
I believe your argument is that I will continue to be me, and me* will essentially be a different person. If I die, that's it. me* lives on, but I am not me* , I am me.
The interesting question, which your parent (in the thread) posits an answer to, is what is the actual difference between me and me* ?
If you believe that humans exist entirely within the physical world (as in, there's no mystical/religious/spiritual element to our existence--we're just matter and energy), then me and me* are no different. Suppose we go back to your idea, that we kill me after creating me*. Since I am a purely physical being, I have no way of distinguishing between me and me*. In particular, if you never told me that I am me*, I would have absolutely no way of telling whether I am me or me*.
Wouldn't me and me*'s consciousnesses diverge after a few hours of experiencing different things in different places? If so, would me agree to suicide knowing that me* is still alive?
If by consciousness you mean introspection or self-modeling or something like that then consciousness is definitely real. But it's just a property that some minds have, and that someday will probably be analyzed and replicated in silico.
But if you mean some kind of special conserved "essence" that makes you "you" in addition to your memories... then yes, it's probably just an illusion.