The Rapture of The Pretty Hip People, Actually

by Belle Waring on May 1, 2017

No spoilers because I’m just talking in generalities. Read away.

Walkaway is a book in which important issues about how we should live, and how we can live, are discussed and hashed out very thoroughly. Not anywhere near the level of Kim Stanley Robinson, when in the course of reading you are inclined to ask, “did I just read 160 pages of minutes from an anarcho-syndicalist collective meeting? Yes, yes I did. Huh. Why am I finishing this trilogy? Oh right, I have a compulsive need to finish any book.” Nonetheless, the discussions are full and mostly quite satisfying even as they treat difficult issues. What do we owe one another in society? How should we distribute resources? (I will note in passing that there is a certain tension between the post-scarcity economy that seems to be available and the widespread poverty of the “default” world, but we can hardly expect a smooth transition from the one to the other; perhaps this is realism rather than inconsistency.)

However, there is one topic that does not get as much of this treatment, in my opinion, even as it is a very live issue in the plot: namely, is a copy of you really you? If your consciousness could be uploaded to a computer and successfully simulated, would this represent a continuation of your actual self, or merely the creation of a copy of you, like an animated xerox? Would you “go on living” in some meaningful sense? What if these new copies of you were drafted as servants, to be used the way we use machines now, but a thousand times more useful?

Secondly, if copies were created at various times, each faithfully representing you at that moment, would they all be you? Only the most recent? What if you were happier in the past and you chose backup 12 instead of 19 to be run in the event of your death? And if you delete the other copies, are you murdering real people? Committing a peculiar multiple suicide?

To be fair, some of these latter issues are explicitly talked about by the characters. Dis, a disembodied sim who is the first mind to be brought successfully to consciousness, knows that varying labs are performing varying experiments on the things she indifferently refers to as “copies of me,” “versions of me,” or “twins.”

There were real fuckups in Madrid who brought up a version of me, tried to make me help them. That copy suicided, after sending messages to all the other groups telling them about the evil shit going down. But Madrid’s the only lab that’s succeeded in bringing up a sim in a stable state. I’m thinking of giving everyone else permission to experiment with versions of me online, as creepy as this is. (p. 142)

Perhaps I should step back and restate this: in my opinion, these philosophical issues surrounding preserving identity across upload are treated interestingly in the plot itself, from which readers can infer varying conclusions up to a certain point in the novel. However, they stand in contrast to the issues around debt, optimal societal arrangements, whether massively disproportionate wealth can ever be justified, and the like, which receive a ton of explicit consideration between the characters, and which arrive at more unified conclusions.

But, paradoxically, the fact that a copy of you is meaningfully you seems to be fully accepted by the end of the book, despite Dis’s acknowledgement of the difficulty around varying simulations of herself, and the varying fates of the uploaded. This is particularly evident in the conclusion, in which some characters who have physically died and “lived” as sims for a time are re-embodied. We are definitely meant to think, “hooray, beloved characters are back and can go to the onsen” and not, “hmm, is this even meaningfully them?”

Obviously to some degree problems about the continuity of consciousness in upload are just an extension of ordinary philosophical concerns about identity. Derek Parfit constructed a thought experiment that encapsulates the issue when he considered what would happen if a person were divided in two, with each having the memories and beliefs of the original. Could one meaningfully be the original and the other a copy? He’s inclined to say no, or that it’s indeterminate. The question is more, do we care about this possible future self? Do we want it to go on, in the same way we want ourselves to go on?

Now, during the course of the novel itself almost all these issues are raised via events in the plot if not in open discussion. (Then what am I complaining about?) I’m not complaining, really, just noting first that, as John put it, Graeber-type concerns eclipse Parfit-like concerns when it comes to explicit dialogue. Secondly, while the events that befall our characters do seem to problematize continued identity across upload very seriously, the characters don’t seem worried enough about this issue, and the novel’s conclusion seems to take it for granted that this problem is solved in a simple and perhaps intuitive way: it’s definitely still you.

Without spoiling things I can tell you about one of the plot points that touches on this most poignantly: someone is taken for dead, and their sim is put up and running in multiple versions, and one of these becomes a kind of house genie for the other main characters. That the “real” person–and we as readers can only think of the meat one as the real one–is rotting away unknown while some version of herself is adjusting the air-con for her friends is painful. It reminded me of a moment in an Iain M. Banks novel in which I realized there was a Made Mind that existed only to maintain a small human habitat (like a studio apartment) on a gas giant where the human resident would otherwise be crushed or poisoned (or all that and lots more). This seems like a poor job for someone who presumably has a brain the size of a planet. Or did they give it a smaller mind so it wouldn’t be bored, like the Deltas in Brave New World, content to sweep up?

Maybe my worries reflect a poverty of imagination about what it would be like to run your own brain, to smooth out the highs and lows or ride them out as you choose, to communicate continually with other machine intelligences. Maybe turning the lights on for the meat puppets occupies so little of your brainpower that it hardly matters, and the servitude is entirely illusory. Nonetheless I felt unsatisfied by the way all these problems seemed wrapped up nicely with a bow on them at the very end, especially when contrasted with the way the characters themselves wrestle convincingly with the social issues of the novel.

{ 18 comments }

1

Neville Morley 05.01.17 at 2:20 pm

Yes; and not just the possibility of restoring an earlier, happier version, but of modifying one’s own mental parameters in medias res. Am I still me if I’ve tweaked my instinctive reactions so that I *don’t* freak out about being a bodiless duplicate of myself? I don’t think it’s a poverty of imagination, but a serious concern about locating the border between self and not-self, humanity and the post-human – and thinking that the distinction matters.

2

max 05.01.17 at 3:38 pm

I literally know nothing about the book you’re talking about. (So this isn’t about the book.)

Derek Parfit constructed a thought experiment that encapsulates the issue when he considered what would happen if a person were divided in two, with each having the memories and beliefs of the original. Could one meaningfully be the original and the other a copy? He’s inclined to say no, or that it’s indeterminate.

Asimov (I think) worked the same terrain in the context of the Star Trek transporter. The show tends to treat transporters as though they open some kind of tunnel and move matter across space from one place to another, but that’s obviously wrong from the standpoint of physics. Asimov was arguing that a transporter was really a duplicator creating myriad physical copies of the people transported, raising the question of what happens to the destroyed original.

But that is all assuming there is an exact duplication of all the matter in the entire physical body. Which is not what’s happening here.

It’s not like nature hasn’t already done this experiment for us: identical twins are natural clones, yet they are different entities starting in the womb, and are different (albeit very similar) people. They also tend to diverge over time. They aren’t exact copies of each other, just genetic duplicates, and no one would ever say they were the same person unless they were confused.

their sim is put up and running in multiple versions, and one of these becomes a kind of house genie for the other main characters. That the “real” person—and we as readers can only think of the meat one as the real one—is rotting away unknown while some version of herself is adjusting the air-con for her friends is painful. […] This seems like a poor job for someone who presumably has a brain the size of a planet.

As far as I can tell, any copy of a person’s mind stored in some kind of a computer is a) almost certainly incomplete, since it lacks the associated hardware/wetware to run on, b) would effectively have to be running on some kind of emulated hardware/wetware/body (because if it didn’t, your mind would simply cease to function), and c) would be able (if emulated properly) to run at computer-like speeds.

I don’t see how a computerized duplicate is in any way ‘the same person’ and thus ‘you’.

I could invoke all the people certain that we live in a simulation, and thus a person could be duplicated by the entities running the simulation, but then we’d be talking about an emulation running on computer hardware from a simulated universe. If you duplicated the entire (simulated) universe, complete with simulated bodies, then we’d be talking about the copies meaningfully being the same person.

Maybe my worries reflect a poverty of imagination about what it would be like to run your own brain, to smooth out the highs and lows or ride them out as you choose, to communicate continually with other machine intelligences.

You’d be talking about a limited duplication evolving at computer-like speeds in a computer-based reality, and I’d think the duplication would start out very different and evolve away from the original at very high speed, under the effects of stimuli that differ radically from the kind of meatspace stimuli experienced by a physical person.

What you’re describing sounds like an ultra-high tech autonomous answering machine tape. Particularly since it doesn’t seem necessary or particularly useful to turn a duplicate into a household controller.

max
[‘I don’t see how it would be ‘you’.’]

3

Belle Waring 05.02.17 at 2:52 am

Star Trek itself confirmed the transporter theory to an extent in TNG: a poor connection caused Riker to both be successfully beamed off the planet he was on and not beamed off, and stipulatively alt-Riker (more plausibly the real Riker if you think about it) was marooned for years before getting rescued by the Enterprise. He had been mooning over Troi the whole time while our Riker was out breaking her heart and others’. Though she returned his feelings, he had to be shipped off to elsewhere in the fleet because it would have been really confusing even if you counted the pips or one of them got a shave or whatever. But nothing could more clearly indicate that a duplicate was being constructed aboard the ship during the snow-globe sequence of the transporter’s operation than that the transportee can be perfectly well left at the other end. This was fudged by claiming the ‘beam had been split.’

4

MFB 05.02.17 at 12:48 pm

This all bothers me a very great deal.

1. An electronically simulated version of yourself could presumably be constructed by precisely simulating the electrochemical processes of the brain and the nervous system. It could then be infinitely duplicated. It should, then, be possible, by judicious tweaking of the sensory system, to compel that version, or simply induce it, to do anything someone else wanted you to do. And that would be the same as doing it to you, and you wouldn’t have any control over it, as far as I can see. This is supposed to be a utopian concept?

Since the post flings around a rather silly criticism of Iain M. Banks’ Minds (any of Banks’ Minds, when interacting with humans, uses only a tiny part of its consciousness, since the point of the Minds is that they exist largely in other dimensions), perhaps it’s worth recalling that one of Banks’ last books posited a civilisation which deliberately created a simulated Hell for those citizens who hadn’t obeyed the Codes. How can Doctorow prevent this from happening? If he can’t, how can this be a utopian text?

And yes, it is rapture for hip people. And rapture itself is a “dividing the sheep from the goats” concept, a form of universally institutionalised apartheid, in terms of which those who are not hip are forever doomed while those of us who have voted for the proper Presidential candidate ascend unto the heavenly places and are never used as electronic toilet-cleaners in the basement of cyberspace.

But in practice we could be. We probably would be. The thing about hipness is that some people are hipper than others, and the ones in command of the code of the simulated universe would have no difficulty declaring their own uberhipness.

5

bianca steele 05.02.17 at 1:55 pm

I like the points you make here. And the quoted paragraph makes me want to read the book! Philosophical discussions (I’ve tried to read Nagel but not Parfit) about these identity questions always make me wonder why the writers find them interesting, unless they’re tied to SF scenarios. I wonder why the philosophers don’t tie them in, whether there’s something deep I’m missing.

Also, you just resolved an argument my husband and I had just the other night! I had misremembered the episode (because when my father woke up in the middle he mumbled “Is that his brother or something?” and I had turned it into a story where he’d guessed the plot without seeing it) and I guess he was right.

6

sanbikinoraion 05.02.17 at 3:06 pm

I don’t really understand how anyone has a leg to stand on in the “I uploaded my consciousness to the internet!!!” stakes.

I mean, your actual consciousness is living in your actual head. If you create a copy, even a perfect copy, of your brain, and then vaporize yourself, then you have, in fact, still vaporized yourself. Yes, there’s another copy of “you” knocking around, but it’s pretty clear that the program running on the original hardware has been terminated. Experientially, the original “you” quite clearly dies. Has no-one round here seen The Prestige…?

I really don’t understand where the leeway for anything else is supposed to be.

7

Petter Sjölund 05.02.17 at 4:30 pm

“The program running on the original hardware” is also terminated every time you go to sleep or become unconscious. And as explained in Parfit’s book, there are already ways to more or less split a human consciousness in two (by severing the corpus callosum) where each half will have the same claim on being the “real me”.

The only thing that makes you the real you is the fact that you can remember your past experiences. If you start forgetting them, say, from senility, and the “copy on the internet” remembers more of them, then I’d say that the “copy” has a stronger claim on being you.

8

bob mcmanus 05.02.17 at 7:18 pm

6: Watch me read me right here now upload my consciousness to Crooked Timber in this comment. It is so hard for me, maybe impossible, to mark the time or place this ‘c’ is no longer connected to me, especially since I will remember it after I shut down my browser. I am certainly more connected to that (?) ‘c’ than to my gall bladder. When I mention the arthritis in my typing fingers, those reading share it, and possibly even feel pain in their own fingers. The self is not a geometric space.

I am infamous for liking to quote. This is because I do not privilege the database in this meat puppet over wikipedia, and do not privilege “my own” experience over others. The abstract space of communication does not exist separately from the meat and metal space. Language is material.

Donna Haraway Cyborg, N Katherine Hayles Posthumanism, Gilbert Simondon Transhumanism. The Lotus Sutra.

9

sanbikinoraion 05.02.17 at 7:39 pm

> The only thing that makes you the real you is the fact that you can remember your past experiences.

No, the thing that makes you you is the set of atoms in your head, being you.

10

Petter Sjölund 05.03.17 at 8:20 am

Then you are not the real you. That set of atoms has been replaced many times over what you think of as your lifetime, not to mention the particles that make up those atoms. There’s no telling where the real you has gone off to.

11

sanbikinoraion 05.03.17 at 1:43 pm

Ha. The gradual replacement (hot-swap, too, note!) of my brain-cells is not the same thing as copying my entire brain state and running it immediately on a new device. My “me-ness” is not affected by the gradual swapping of matter, because the state remains running on the same device, in my head.

Categorically, imaging my brain state and starting off a new instance does not result in my currently active consciousness somehow “hopping” to the new form. The new form would experience it as if that were the case but it is, in fact, not the case.

12

Yankee 05.03.17 at 4:24 pm

When we’re talking about continuity of identity, that supposes that the chain of experience is the thing we should follow. All those sims are having different experiences (“they” are portrayed as messaging, although I suppose something more borg-like would be … conceivable.) The experience of having a body composed of “sensors” and power supplies and broadband connections seems sufficiently “different” to qualify a sim as a distinct person, and how you would stuff that experience back into a meat package is … not conceivable.

Analogize to one pregnant woman becoming two people. The infant person commences individualized experience in a radically altered physical instantiation.

… I got halfway through the book and got tired of strawmen with hidden virtues jumping out of closets. I do envy people who read fast enough that they can afford a compulsion to finish everything.

13

Yankee 05.03.17 at 4:43 pm

I was going to say, the ST transporter results in a clearly continuous experience; the ick issue is whether it’s edifying to make a distinction between the accumulator of experience and the accumulated experience. I would think yes, although Star Fleet is pretty cavalier about it.

(note, everybody’s experience is a worldline embedded in the 4D fabric of spacetime and can therefore be said to be “permanent” although every body will cease being active and return to quarks eventually.)

14

sanbikinoraion 05.04.17 at 2:29 pm

Think of it another way: offered the choice to travel from London to New York instantaneously, you’d take it, right?

But what if someone instead offered to clone your entire body, creating the clone in New York (assuming you’re in London), and then to vaporize you immediately?

How many people do you think would take that option…?

15

Moz of Yarramulla 05.05.17 at 3:41 am

If nothing else it would conclusively disprove the possibility of souls existing, at least in the “one per person” sense. I can imagine some religions not dealing well with this, and that causing political problems even outside nominally religious states.

I’ve been thinking more about the ethics of instantiating multiple physical copies for the purposes of experiment. At the trivial level, I’d like to go bush and build my own off-grid haven. Being able to do that while also staying in the city would be interesting. But duplicating myself to see whether an experimental medical treatment works would be much more ethically/philosophically interesting, and extremely handy. It would also be hard to argue against even quite extreme experiments on the usual ethical grounds – I think I should be allowed to consent to an experiment that would certainly kill me, since at least one copy would still exist. So medical research would possibly get a boost, modulo the medical advances required to duplicate me in the first place (but again, the book uses magic so doesn’t have that problem). Either during development of the copying process, or shortly afterwards, I would expect extremely reliable psychiatric interventions to be commonplace.

A philosophical complexity that arises is the extent to which I can pre-consent to the creation of a modified (“fixed” or “improved”) copy of myself, and what obligation (if any) that copy has to me after creation. That’s even ignoring the tricky legalities (it makes the multiple marriage thread here seem trivial by comparison – are two copies of me still married to two copies of my husband?). But if I make a copy with, say, gills, is that copy obliged to even consider my idea of moving to a Pacific Island and becoming a marine biologist? What if I-with-gills decides that I-with-lungs made a horrible mistake, and that I-with-gills shouldn’t exist (or worse, that I-with-lungs shouldn’t)?

16

Moz of Yarramulla 05.05.17 at 3:44 am

Oooh, ooh, can I disown myself? What does that even mean? Who’s liable for my debts? If I make a copy and then the original commits suicide, is the debt inherited by the copy? What about the assets? I foresee a thriving trade in strategic dying amongst the wealthy unless that is sorted out very clearly (cf. strategic bankruptcy, which is already an issue).

17

bianca steele 05.05.17 at 12:24 pm

I’d like to correct my comment @5, if I may. I read, or tried to read, Nozick on this, not Nagel.

18

Matt 05.06.17 at 9:06 am

I just finished the book and I love almost every question it raises but roll my eyes violently at almost every answer it suggests. In the case of the brain emulations especially I can’t tell if Doctorow has accidentally introduced a wildly overpowered device to solve the immediate crisis and then neglected all its higher order effects, like the worst of Star Trek episodes. Or maybe the wildly overpowered device used to solve a quotidian problem and nothing more is a deliberate joke, like in a good Futurama episode.

Immortal-emulation Dis, speaking to imprisoned Natalie:

“We pwned this place as soon as you went. It was Gretyl’s project, but I did the heavy lifting. We used like seventy percent of walkaway’s compute-time running parallel instances of me, at twenty ex realtime. We clobbered the fucking IDS, smoked the firewall, and now I’m so deep I can do anything.”

Scrappy misfits using crumbs of compute power scrounged from Default’s trashcans summon a weakly superhuman successor to Homo sapiens sapiens. And use it mostly to keep one person company while they make a plan to break her out of captivity. The unstoppable faster-than-human AI swarm constructed as a tool for a prison break is a banana peel obviously placed on the sidewalk here. It’s multiple banana peels. One placed by Peter Watts, one by Hannu Rajaniemi, one by Ken MacLeod… and the story just steps over them all. I can’t tell if Doctorow is joking or ha-ha-only-serious.

I don’t think that whole brain emulation is forbidden by physics or souls or anything, but I don’t think it can happen soon. Or maybe ever. By way of analogy: real physical processes can produce heavier elements from light ones like carbon and oxygen. But if someone says that humans will eventually solve global warming by fusing CO2 into stable metals, imma eyeroll.

So I’m already suspending a fair bit of disbelief to countenance whole brain emulations in 70 years or so; it’s a suspension too far to also imagine that you can introduce them without profound effects that swamp merely human-scale drama. It becomes a story necessarily set in the posthuman era. And telling a coherent story about posthumans is as difficult and pointless a feat as someone from the 18th century telling a story about the NSA hacking Petrobras. No author can imagine alien futures that clearly, and even if they could, no readers from the same era could understand it.
