- cross-posted to:
- programmer_humor@programming.dev
I think SOMA made it pretty clear we’re never uploading jack shit, at best we’re making a copy for whom it’ll feel as if they’ve been uploaded, but the original remains behind as well.
A lot of people don’t realize that a ‘cut & paste’ is actually a ‘copy & delete’.
And guess what ‘deleting’ is in a consciousness upload?
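To put the joke in code: a minimal Python sketch (a toy dict standing in for a mind) of "cut & paste" being copy-then-delete under the hood:

```python
import copy

# A "cut & paste" of a consciousness: there is no move operation,
# only a copy of the original followed by a delete of the original.
original = {"name": "me", "memories": ["childhood", "yesterday"]}

uploaded = copy.deepcopy(original)  # the "paste": an exact clone
del original                        # the "cut": the source is destroyed

# The clone is indistinguishable from the source it replaced.
```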
I mean, if I die instantaneously and painlessly, and consciousness seems continuous for the surviving copy, why would I care?
My consciousness might not continue, but I lose consciousness every day. Someone exists who is me and lives their (my) life. I totally understand people's aversion to death, but I also don't see any difference from falling asleep and waking up. You lose consciousness, then a person who has lived your life and is you regains consciousness. Idk.
You make a good point. We all might be being copied and deleted in our sleep every night, for all we know.
There’d be no way to know anything even happened to you as long as your memory was copied over to the new address with the rest of you. It would be just a gap in time to us, like a dreamless sleep.
Old post, but… if it's just memory, you'd lose trauma and other ingrained coping mechanisms, no? There's no brain to try and fight back against things. Just memories making you… you…? Or not you, if you lose some of your behaviors?
Most people don’t like the idea of a suicide machine.
Yeah, and I completely understand that. Just from a logical perspective, though: let's say the process happens after you fall asleep normally at night. If you can't tell it happened, does it matter? I've been really desensitized to the idea of dying through suicidal ideation throughout most of my life (much better now), so I'm able to look at it without the normal emotional aversion. If teleportation existed via this same method, I don't think I'd have qualms about at least trying it. I certainly wouldn't expect other people to, but to me it's not that big a deal. I wouldn't do a mind upload scenario, though that's more due to a complete lack of trust in system maintenance and security, and a doubt that true consciousness can be achieved digitally. If it's flesh and blood to flesh and blood, though? I'd definitely try it.
It's the transporters all over again.
It’s all good as long as you’re always on the better side of the coin flip.
I wonder how you ever could “upload” a consciousness without Ship-of-Theseusing a Brain.
Cyberpunk2077 also has this “upload vs copy” issue, but doesn’t actually make you think about it too hard.
That’s what I’ve always thought more or less, to have a chance you would need a method where mental processing starts to be shared in both, then transfers more and more to the inorganic platform till it’s 100% and the organic isn’t working anymore.
The animated series Pantheon has a scene depicting exactly this, and it’s one of the most disturbing things I’ve ever seen.
Edit: Here is the scene in question. It’s explained he has to be awake during the procedure because the remaining parts of his brain need to continue functioning in tandem with the parts that have already been scanned.
Interesting, but I would argue that's actually still a destructive copy process. "Old Man's War" did a good job of what I'm talking about. It was body to clone body, but the principle was similar: at the halfway point the person was experiencing existence in both bodies at once, seeing each body from the perspective of the other, until the transfer completed and they were in the new body while the old one slumped over.
That also reminds me of this scene from Invincible where during the copying process their experiences are sort of “blended” making them see from both bodies at once, only here they both live and are separate afterwards.
Edit: is it obvious how much of a sci-fi geek I am lol
You would have to functionally duplicate the exact structure of the brain or its consciousness while having the duplication mechanism destroy the thing it was reading at almost exactly the same time. And even then, that’s not really solving the issue.
I don’t see an issue with that. A prolonged brain surgery that meticulously replaces each part with a mechanical equivalent in sequence. Could probably remain conscious the whole time.
Yeah, but it’s still a Ship of Theseus problem. If you have a ship and replace every single board or plank with a different one, piece by piece, is it still the same ship or a completely different one, albeit an exact replica of the original. It’s important because of philosophical ideas around the existence of the soul and authenticity of the individual and a bunch of other thought-experimenty stuff.
I think so long as you maintain consciousness that issue is fairly null in this particular circumstance. There’s lots of tolerance for changes in thought while maintaining the same self, see many brain damage victims. So long as there is minimal change in personality, there are lots of other circumstances that have a stronger case for killing one person and having a new person replace them due to change of consciousness, imo, I don’t think most people would consider a brain damaged person killed and replaced by a new consciousness, or a drug addiction with radically altered brain chemistry, etc.
Not necessarily. Imagine you begin suffering from Alzheimer's, and artificial neurons are making a copy of your brain. Once a neuron stops working, a backup one replaces it. Your mind, if this worked, could see the new neuron as part of the same brain and work with it seamlessly.
Yeah, like replacing individual brain cells with more durable mechanisms. Idk, maybe they would be cellular as well. …That makes me wonder: maybe it's possible to transfer consciousness even with traditional biological mechanisms?
Any sufficiently identical copy of me is me. A copy just means there are more me in the universe.
reproduction 101
That ending screwed with my mind. Existential horror at its finest!
I was just annoyed at the protagonist for expecting anything else. The exact same thing already happened twice to the protagonist (the initial copy at the beginning of the game, then the move to the other suit). Plus it's reinforced in the found notes for good measure. So by the ending, the player knows exactly what's going to happen, and so should the protagonist, but somehow he's surprised.
Yeah, true. But Catherine said it perfectly at the end. Something like "You still don't get it? What did you expect?". The fact that one copy of his consciousness remains down in the abyss was kind of frightening. All by himself.
Two, actually. The one from before the suit change is also left there, and Catherine said he will wake up in a day or two. Maybe they can meet up, actually.
You didn’t kill old suit you? Cruel.
Ahh, but here’s the question. Who are you? The you who did the upload, or the you that got uploaded, retaining the memories of everything you did before the upload? Go on, flip that coin.
If you are the version doing the upload, you’re staying behind. The other “you” pops into existence feeling as if THEY are the original, so from their perspective, it’s as if they won the coin flip.
But the original CANNOT win that coinflip…
But like… do I care? “I” will survive, even if I’m not the one who does the surviving.
I can’t speak for anyone else, but I would. The knowledge that “A” me is out there, somewhere, safe and sound, is uplifting, but it’s still quite chilling to realize you are staying wherever the hell you are. At least we die after enough time has passed because our bodies decay.
revelation
The SOMA protagonist wasn’t that lucky…
Is it chilling? I was already going to stay where I am, whether I made a copy or not. Sharding off a replica to go on for me would be strictly better than not doing that.
I think it’s both for me, which I think is what you might be saying as well. I would absolutely push the button to create the copy, or whatever, because I think I would derive satisfaction from creating a life (identical to mine, no less) that was free of the circumstance I was in, which must have been dire. However, I definitely don’t consider that instance “me” even if I do consider the copy a legitimate, separate version of “me”, so I don’t feel that I have perpetuated my own instance, leaving me in whatever fight-or-flight terror I was in to cause the scenario in the first place.
What do you mean he wasn't so lucky? After all, he lived out his life in Toronto. That he did a brain scan at some point in his life doesn't matter. Sucks for the robot who thought he was him.
Which instance of Theseus's ship am I?
Implementation will be

```c
{
    // TODO
    return true;
}
```
It’s still a surviving working copy. “I” go away and reboot every time I fall asleep.
Why would you want a simulated version? You will get saved at "well rested." It will be an infinite loop of being put to work for several hours and then deleted. You won't even experience that much; your consciousness is gone.
Joke’s on them, I’ve never been “well rested” in my life or my digital afterlife.
Yup. Read this and you will forever be against brain-uploading:
If anyone’s interested in a hard sci-fi show about uploading consciousness they should watch the animated series Pantheon. Not only does the technology feel realistic, but the way it’s created and used by big tech companies is uncomfortably real.
The show got kinda screwed over on advertising and fell to obscurity because of streaming service fuck ups and region locking, and I can’t help but wonder if it’s at least partially because of its harsh criticisms of the tech industry.
Upload is also good.
I get this reference
Well yeah, if you passed a reference then once the original is destroyed it would be null. The real trick is to make a copy and destroy the original reference at the same time, that way it never knows it wasn’t the original.
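In CPython, where an object is freed as soon as its last strong reference goes away, the dangling-reference half of this can be sketched with the standard `weakref` module (the class name is just for the bit):

```python
import weakref


class Consciousness:
    def __init__(self, name):
        self.name = name


original = Consciousness("me")
handle = weakref.ref(original)  # a reference to the original, not a copy

assert handle() is original     # while the original lives, the ref resolves

del original                    # destroy the original...
# ...and the weak reference now dangles: handle() resolves to None ("null").
```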
I want Transmetropolitan style burning my body to create the energy to boot up the nanobot swarm that my consciousness was just uploaded to
I think you mean `std::move`
get your std away from me sir
I dunno. I could be quite happy having brain children or as a copy of a consciousness at a given point in time.
You see, with Effective Altruism, we’ll destroy the world around us to serve a small cadre of ubermensch tech bros, who will then somehow in the next few centuries go into space and put supercomputers on other planets that run simulations of people. You might actually be in one of those simulations right now, so be grateful.
We are very smart and not just reckless, over-indulged douchebags who jerk off to the smell of our own farts.
Sure, some probably do. And you can be sceptical and discuss why that's a dangerous and undemocratic direction. Effective Altruism is a question, not an answer. In the community, asking for and being open to critical feedback is encouraged as the main tenet of good culture.
But if you look at the amounts, most EAs donate most to helping the poorest people alive today. Because it is so obviously good, and proven to work with high certainty.
If you are interested in learning more about Effective Altruism, check out https://www.effectivealtruism.org/articles/introduction-to-effective-altruism
Source for distribution of donations: https://80000hours.org/2021/08/effective-altruism-allocation-resources-cause-areas/
Do you really think that’s what effective altruists want?
You really gotta look at their actions. What they say they want and what they show us they want are clearly two different things.
Do you think that your assertion, that they want to destroy the world around us in order to provide “value” to a small group of tech bros is at odds with the underlying philosophy of effective altruism? It seems like anyone who wanted to create the most good for the most people would be opposed to a future like that.
There are many languages I would rather die than be written in
Glad that isn't Rust code, or the pass-by-value function wouldn't be very nice.
Borrow checker intensifies
In a language that has exceptions, there is no good reason to return bool here…
Result<_>
HRESULT
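A hedged Python sketch of the same point, with hypothetical names: in a language with exceptions, failure is signalled by raising, not by returning a bool the caller is free to silently ignore:

```python
class UploadError(Exception):
    """Raised when the consciousness transfer fails (hypothetical API)."""


def upload_consciousness(snapshot: dict) -> None:
    # Raise on failure instead of returning False: the error cannot
    # be accidentally dropped the way an unchecked bool can.
    if not snapshot:
        raise UploadError("nothing to upload: empty snapshot")
    # ... perform the (copy, not move) transfer ...


try:
    upload_consciousness({})
except UploadError as err:
    outcome = str(err)
```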
Literally the plot twist in…
spoiler
Soma
```
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: http://pandas.pydata.org/pandas-docs/stable/indexing.html#indexing-view-versus-copy
```
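For anyone who actually hits this warning: it means chained indexing may have written to a throwaway copy instead of the original DataFrame. A minimal sketch of the `.loc` fix it suggests (toy data, obviously):

```python
import pandas as pd

df = pd.DataFrame({"name": ["orig", "copy"], "uploaded": [False, False]})

# Chained indexing like df[df["name"] == "copy"]["uploaded"] = True
# may assign into a temporary copy and trigger the warning above.
# .loc addresses the original frame in a single indexing step:
df.loc[df["name"] == "copy", "uploaded"] = True
```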
`public static Consciousness Instance;`
What if every part of my body is replaced by computer parts continuously? At what point do I lose my consciousness?
I think this question is hard to answer because not everyone agrees what consciousness even is.
It wouldn’t really matter until you get to the brain. Very little of your body’s “processing” happens outside of your brain. Basically all of your consciousness is in there. There are some quick nerve paths that loop through your spine for things like moving your hand away when you touch a hot object, but that’s not really consciousness.
The Closest-Continuer schema is a theory of identity according to which identity through time is a function of appropriate weighted dimensions. A at time 1 and B at time 2 are the same just in case B is the closest continuer of A, according to a metric determined by continuity of the appropriate weighted dimensions.
I don’t think that I fully agree with it but it’s interesting to think about