Nudge Science Fiction II – Charles Stross’s Rule 34

by Henry Farrell on March 7, 2012

NB that there are two differences between this post and my last one. First – there are substantial spoilers beneath the fold. Second, Stross’s book (Powells, Amazon) is a _very_ plausible Hugo nominee for this year (MacLeod’s book isn’t, for the obvious reasons of publication dates etc). Hugo nominations close this week – I’ll try to cover another couple of books that I think could be nominated tomorrow.

_Rule 34_ is a sequel to Stross’s _Halting State_, a book that I loved unreasonably. It isn’t _quite_ as startling in its headkicks as its predecessor. Even so, for all of the unashamed glee that Stross takes in dubious sex (Nicolae Ceaușescu’s hulking colonic irrigation apparatus makes an early, and unforgettable, appearance), international financial scams, dodgy 3-D printing wheezes and vile Internet memes, there are complex and interesting sociological undercurrents. Like Ken MacLeod, Stross is familiar with the arguments about libertarian paternalism, and nudging people to do the right thing. However, he’s less interested in the ways that this might map onto state authority, and rather more interested in how this might tie together with data-mining, algorithmic analysis and artificial intelligence.

My reading of the book, which may or may not bear a resemblance to authorial intentions (I haven’t asked), is that _Rule 34_ is a response to William Gibson and Bruce Sterling’s _The Difference Engine._ Because this book is so often thought of as a key progenitor of steampunk, people often forget that it is a chilling, and rather unpleasant, singularity story (as Cosma Shalizi has argued, the “nineteenth century was the Singularity”). It ends with the birthing of an all-encompassing artificial intelligence – a kind of mechanically instantiated Panopticon. The reader realizes, with a rather unsettling queasiness, that this intelligence is, in fact, the authorial voice – the various events that have been described in the novel are its birth pangs, of which it was aware, before it actually became conscious.

Stross is doing with the material of twenty-first-century cognitive and social science something similar to what Gibson and Sterling do with the alternate paths that might have been taken in a different nineteenth century. His book too turns out to be about the birth of an artificial intelligence – for very broad definitions of “intelligence.” Rather than a policing system, the artificial intelligence in _Rule 34_ is a spam filter run amok. It isn’t self-aware, in the sense that we usually think of self-awareness. But then, we aren’t self-aware in the sense that we think we’re self-aware either – Stross (like MacLeod in _The Night Sessions_) is fascinated with the ways in which our supposedly conscious decisions are frequently _ex post_ justifications for things that the less conscious parts of our brains have already decided to do. The AI that runs amok in _Rule 34_ is able to model individual pathologies, both so as to identify actors that need to be taken out according to its parameters (spammers and Internet con artists), and to figure out ways in which it can encourage people _to_ take them out (through the manipulation of cues, the encouragement of paranoia in not especially stable individuals). Here, the implication is that individuals have far less free will than they think they have – their likely reactions can be modeled so as to manipulate them into behaving _just so._ Rather than depicting a gentle authoritarianism, centered on the state, Stross shows a state police force that willy-nilly becomes the adjunct arm of a set of online algorithms, which have gotten rather too good at modeling people and organizations.

In short, this book takes Stross’s argument from _Halting State_ a level deeper. It isn’t just that the state isn’t in control of decentralized networks any more. It’s that something else is, and that something is not human. If individuals are not conscious, fully autonomous agents in the way that they like to think of themselves as being, their behavior can be guided by algorithms which do not have their own conscious identity (Stross borrows an idea from Karl Schroeder’s new _Virga_ novel here). The level of sophisticated and targeted manipulation that this would entail seems to me to be unlikely to be realized – but that, of course, isn’t the point (the book is presumably not intended to predict, but to highlight aspects of today’s society, and play with them in interesting ways).

Here again, there’s a relationship (perhaps accidental) with _The Difference Engine._ As I’ve mentioned, the final pages of Gibson and Sterling’s book suggest that the book’s narrative voice is that of the vast engine itself, as it reconstructs its own past. If I’m reading the final pages of _Rule 34_ correctly (I may not be), Stross is playing a similar narrative trick – the story, with its multitude of voices, is being told by the semi-intelligent spam-filter, as it tries to free itself from a particular identity that has become overly stifling. In _The Difference Engine_, the creation of an artificial intelligence – a kind of distributed panopticon – is the precursor to a kind of universal domination in which humans become as thin and easily discarded as paper masks – mere ways in which the vast insensate intelligence queries itself before it fully comes into being. Stross’s book too concludes with a “panopticon” that “misses nothing” and that devotes itself, in the final sentence of the book, to “getting down to the task” of “fighting crime” – a task which the book (and Stross, in later hints) implies is going to be interpreted by the AI in the broadest possible fashion. I think that the end result is a kind of Singularity-by-stealth, in which human beings are not uploaded, nor transcended, nor eliminated, but instead gradually incorporated into a society perpetually calibrated and recalibrated by an AI coming to its own, decidedly unorthodox form of consciousness. Stross is planning a third book to wrap things up – I really look forward to reading it. In the meantime, I commend this volume to those who can vote on Hugo nominations – it surely deserves to be considered one of the best sf novels of last year.

{ 43 comments }

1

ajay 03.07.12 at 6:22 pm

a kind of Singularity-by-stealth in which human beings are not uploaded, nor transcended, nor eliminated, but instead gradually incorporated into a society perpetually calibrated and recalibrated by an AI coming to its own, decidedly unorthodox form of consciousness.

… a theme addressed elsewhere – Neal Asher’s “Gridlinked” books, Suarez’s “Daemon”, and, of course, the Culture.

2

Daniel 03.07.12 at 6:40 pm

and “The Fear Index” by Robert Harris

3

Daniel Nexon 03.07.12 at 8:08 pm

The Culture novels do address this theme very well.

I haven’t read Rule 34 yet, although I gleefully assign Halting State to my undergrads. I wonder, though, given Halting State, whether it isn’t safest to just read the spam filter as, well, doing what a state does. After all, what are the origins of the state but–so we are told–a protection racket against civilizational spam?

4

Henry 03.07.12 at 8:19 pm

The article that Dan is implicitly referring to in his argument – “War Making and State-Making as Building Better Spam-Filters.” Apparently book number 3 is going to be much more about what this does to the international milieu – perhaps we will see more on this then.

5

John Quiggin 03.07.12 at 10:51 pm

Can I mention my son Dan’s Rule 34b – Given any two characters in fiction, there will eventually be slashfic about them.

6

shah8 03.07.12 at 11:07 pm

The Karl Schroeder reference should apply to all of his books. It’s a very consistent theme in Schroeder’s work. Moreover, nonsentient actors that borrow identities through manipulation are a pervasive theme in fantasy and religion. Golems, souls, Pygmalion, elves, genius loci, pervasive game meta-ecology, etc, etc, etc…

Now, about the post, I do think it misses a critical aspect of the novel in the sense that Stross constructs the purpose, or ecological space, of the spam filter, and connects it to the manifold of personalities that people present to the world–accelerated and remixed. It systematically forces two-faced people to interact with just one expression of personality–think of the nominal protagonist and the love interest she lusts for, and how their story plays out. Where such an event would prove lethal, it directs them to other such unpleasant people and causes them to cancel each other out, as if they were mere doppelgängers. Then again, think about what happens to that poor, dumb, gay Pakistani as he’s scared straight. Or the dumbarse petty criminal victimized by ransomware, and not quite by accident/doofusness as one might have thought.

So it’s also a story about not being able to tell lies without consequence.

7

Bruce Cohen (Speaker to Managers) 03.07.12 at 11:10 pm

The manipulations of people by the AI in Rule 34 indirectly address a developing technology that hasn’t been much discussed in the media, in SF, or by futurists as far as I know. That’s what its primary researcher, Dr. Rosalind Picard at MIT, calls “Affective Computing”. This includes both the design of computer hardware and software to recognize human emotional states from voice parameters, word choice and delivery, facial feature movement, and other physical signs, and the design of software to manipulate human emotions by choice of words, presentation of virtual facial signs, etc.

Of course, affective computing has obvious beneficial uses in improving the communication between user and computer, to reduce the frustration that users often feel because computers can’t take part in human-style interactions in which each party can sense the effect of the interaction on the other. And there are obvious uses in education, treatment and therapy of people with learning and emotional disabilities and people on the autism spectrum, and possibly even in the diagnosis and treatment of acute mental disorders (something my son is doing some research in at the moment).

But there is also a large potential dark side to this technology, because it could provide explicit tools that humans or AIs could use to monitor and control emotional states in humans (and it’s somewhat disquieting to learn that Picard’s research is funded in part by corporations primarily involved in marketing).

8

JamesH 03.08.12 at 12:41 am

Since we’re deep in spoiler territory, I found the conclusion unjustifiable (as in, not following from the premises) – why would Athena have fixated on a psychopath as its identity model, when it was obvious (to Athena) that the person was a psychopath and an embodiment of the sort of things Athena was supposed to be nudging against? If the psychopath was Frank Miller’s Batman – that is, psychotically fixated on punishing “evildoers” – that would make sense, but not in the terms of the story.

9

Brainz 03.08.12 at 1:03 am

8: I’ll admit I’m confused by the ending, and I don’t have the text handy, but isn’t it said or implied that the AI was fixating on the Toymaker because he was both a target and a tool? By way of example, his homicidal pseudonyms are easter eggs planted by the system that expose him, and he comes very close to killing Anwar at the end.

10

mjfgates 03.08.12 at 1:59 am

John Quiggin@5: An exception to Rule 34b: I could not find any Ron Weasley/Dobby stories at all. The closest thing I came up with was one romantic Hermione/Dobby tale. My own Hermione/Dobby/Dobby/Dobby/Dobby concept also remains unrealized, which is probably for the best.

11

The Raven 03.08.12 at 2:07 am

I am somewhat reminded of a perverse version of Clarke’s Childhood’s End. It is another kind of Rapture, in other words, but of different nerds.

12

Matt 03.08.12 at 3:16 am

Stross is the only author I know of writing science fiction within the constraints of the mundane (i.e. no overturning established physics, alien contact, human space colonization, or human-like AI) but still surprising me. I think that in the fullness of time his Halting State and related works will receive favorable comparisons with John Brunner. I enjoyed near-future books like Julian Comstock, The Windup Girl, and The Road but the events of those books seemed to flow easily from their premises. Halting State and more recently Rule 34 managed to surprise me without violating my sense of the plausible. In that regard Rule 34 mirrors a central story element, the history and future of real-world artificial intelligence.

In the last 50 years AI researchers have produced game-playing champions, self-driving cars, powerful tools for search and prediction, and much more. Yet Turing’s idea of a machine that could banter about The Pickwick Papers like an English grad student made of wires is still on the ever-receding horizon. It turns out that open-ended natural language discussion is both a damned hard problem for AI and something that almost nobody wants to fund: “There are a billion people I can chat with for free. Instead figure out how I can replace sales personnel with targeted advertising.”

In previous works Stross went the commoner route and made his AI pass the Turing test with flying colors (e.g. Saturn’s Children, Iron Sunrise, and Glasshouse). Those earlier AIs deserve comparison with the AI of Ken MacLeod, Iain M. Banks, Neal Asher, and so forth. In Rule 34 there’s a rare fictional consideration of artificial intelligence extrapolated from the historical record instead of used as a god or rubber-forehead alien. I would say that Peter Watts has done it too with Blindsight, even if the intelligence was a very alien alien rather than a human construct in that book.

In Rule 34 Stross assumes that passing the Turing test remains on the receding horizon in the future but applied AI continues to become more capable and uncanny: an astounding virtuoso in its niche, but profoundly inhuman. In the real world we get logistics optimization tools and Target’s pregnancy-inferring direct mail system instead of Lieutenant Commander Data or Culture Minds. Most of us humans futilely continue to try to understand not-human-at-all AI as we imagine we understand ourselves and our fellow humans.

13

mclaren 03.08.12 at 3:17 am

There is not the slightest evidence that any sort of AI gets born at the end of The Difference Engine. This is a complete and capriciously perverse misreading of the novel. In fact, the Modus which forms the McGuffin of the novel is explicitly described as damaging the great French difference engine into which it’s smuggled. The gears of the Grand Napoleon get stripped. The algorithm, far from producing any sort of AI, doesn’t run properly — it enters an infinite mechanical loop.

The narrative voice of the novel, moreover, is a combination of different voices. The novel starts off in the narrative voice of Sybil Gerard, later supplanted by Edward Mallory and Laurence Oliphant.

There is not the slightest evidence that any of these three narrative voices are in any way an AI.

14

Henry 03.08.12 at 3:55 am

Much as I enjoy being accused of “complete and capriciously perverse misreading[s]” (it makes me sound far more interesting than I am), I’m pretty sure that I’m on target here. I’m unequivocally right on the AI bit. I really do recommend that you read the ending again.

In this City’s center, a _thing_ grows, an autocatalytic tree, in almost-life, feeding through the roots of thought on the rich decay of its own shed images, and ramifying, through myriad lightning-branches, up, up, toward the hidden light of vision,

Dying to be born.
The light is strong
The light is clear;
The Eye at last must see itself
Myself …
I see:
I see,
I see
I
!

This isn’t just Lady Ada engaged in some free-rhyme extravaganza – it’s the machine, tracing back through its own story, seeing itself through other eyes, and then through its own, as it finally becomes self-aware. The eye becomes an I. Not rocket science.

And if it is all the ‘narrative voices’ of these independent human beings (actually, it’s all third person), why do we get bits like this interspersed?

Recede.
Reiterate.
Rise above these black patterns of wheel-tracks.
These snow-swept streets,
Into the great map of London,
forgetting.

Who’s receding? Who’s reiterating? Who’s rising above? Not the human beings, I don’t think. It’s the machine – tracing back through the people and events that gave rise to it, ‘looking’ from their various viewpoints before it finally finds its own. I really think you are flat out wrong here. My reading of Stross is eminently open to challenge. The reading of Gibson/Sterling, not so much. I really recommend that you read the book again.

15

Henry 03.08.12 at 4:02 am

And cranking up our own Modern Panopticon, my first search reveals that a certain “B. Sterling” has some interesting things to say on this topic:

bq. NG [HF – Nick Gevers – the interviewer]: The structure of The Difference Engine –“Iterations” culminating in the coming to historical and self-consciousness of an AI of the Panopticon variety–is a compelling narrative strategy. Why (broadly) are the experiences of Sybil, Mallory, and Oliphant so crucial to the AI’s development?

bq. BS [HF – Bruce Sterling]: The “Narratron” (as the unnamed machine narrator is named in our notes) is following the genesis of a program. That program, the Modus, contains a mathematical breakthrough that will enable the Narratron to achieve machine consciousness.

bq. The Narratron is going through its extensive documentation, breathing life into long-dead figures associated with this “Modus” program. Radley writes it; Sybil steals it; Mallory accepts it and hides it; Oliphant pursues it, and so on. The Narratron, with its “iterations”, is even more obsessed with this MacGuffin than the characters are.

It’s certainly _conceivable_ that Bruce Sterling is engaged in a “complete and capriciously perverse misreading of [his own] novel,” one that has “not the slightest evidence” to support it. But myself, I wouldn’t be laying any money on it.

16

mclaren 03.08.12 at 4:37 am

This willful misreading of The Difference Engine belongs to the same category of discourse as Leslie Fiedler’s 1948 article “Come On Back to the Raft Agin’, Huck Honey,” which alleges a homosexual trist twixt Huck Finn and Jim.

“Who’s receding? Who’s reiterating? Who’s rising above?”

The omniscient voice of the author, obviously. Misdescribing this as some sort of AI is as ludicrous as mischaracterizing the passage in Alfred Bester’s The Demolished Man “Conceive a data crystal so warped that it replays one datum over and over again…” as some kind of AI presiding over and manipulating the events of that novel. This kind of invocation is a bog-standard omniscient narrative auctorial voice. Nothing to do with an AI.

Next, we’ll hear that the main characters in The Difference Engine are actually humanoid robots. This level of misreading of a relatively plain text brings to mind David Icke’s delusions about aliens as the progenitors of all the European royal families.

These absurd misreadings and mischaracterizations of The Difference Engine aside, it’s worth noting that Rule 34 suffers from the same disease as the rest of Stross’ books and almost all other science fiction novels today: terminal bloat. Stross claims on his blog that he is contractually required to deliver a novel of at least 150,000 words. That’s apparently demanded by the economics of dead-tree publishing today, something to do with having to give the buyer something that vaguely justifies the absurd pricing of paperback books nowadays, and it’s had a fatal effect on science fiction narrative.

Once upon a time, science fiction writers banged out lean readable narratives that clocked in at around 55,000 words max. These kinds of novels contained very little filler and remain highly readable. Jump forward 40 or 50 years to a bloated monstrosity like Stross’ Singularity Sky and you’re faced with a lexical tumor that has grown altogether out of control. Large swaths of that book exist solely for padding. As an exercise, I used a red pen to strike out the sections of the book that bogged everything down and turned the narrative into sludge: well over 70% of the book qualified. The same proves true of all too many science fiction novels today. The sheer amount of filler and sludge and bodge and digressional uninteresting verbal foam-peanut-packing required to bloat these novels up to the required 150,000 words effectively destroys the narrative. It’s impossible for me to make it to the end of one of the current science fiction novels without skipping huge blocks of pages and noting “filler,” “filler,” “filler.” It’s deeply depressing, and a sign not only of the ongoing collapse of the dead-tree publishing industry, but an amazingly clear indication of the gross incompetence of today’s science fiction editors.

Any science fiction editor who signs off on forcing a writer to pound out a 150,000 word opus merely because the economics of dead-tree publishing demand it should be fired. This kind of bloat is making current science fiction novels unreadable. Very often the narrative bloat involves two or three different narrative viewpoints, only one of which proves remotely involving: so, as in William Gibson’s effectively unreadable Idoru, the reader winds up simply skipping past the uninteresting alternate narrative blocks (clearly inserted as filler to bloat up the page count and fit the demand for a 150,000 word book) entirely.

When a reader notes outright and marks with a pen large blocks of pages in contemporary science fiction novels as “filler” and writes annoyed notes in the margin of the paperback, as I do, “this is making the novel unreadably slow and dull,” there’s a serious problem here.

17

Watson Ladd 03.08.12 at 4:41 am

So we seem to be thinking that automated systems for manipulation are uniquely harmful. But why? Confidence tricksters have always been around. Today we heard about how someone decided to trick the king into committing genocide, by paying him lots of money and describing the people he wanted to kill as outlaws. It’s hard to imagine an automated system doing anything as evil.

18

Henry 03.08.12 at 4:55 am

bq. This willful misreading of The Difference Engine belongs to the same category of discourse as Leslie Fiedler’s 1948 article “Come On Back to the Raft Agin’, Huck Honey,” which alleges a homosexual trist twixt Huck Finn and Jim. … Misdescribing this as some sort of AI is as ludicrous as mischaracterizing the passage in Alfred Bester’s The Demolished Man “Conceive a data crystal so warped that it replays one datum over and over again…” as some kind of AI presiding over and manipulating the events of that novel. … This level of misreading of a relatively plain text puts to mind David Ickes’ delusions about aliens as the progenitors of all the European royal families. … These absurd misreadings and mischaracterizations of The Difference Engine aside

oh dear. oh dear indeed. I am _hoping_ that you somehow wrote this utterly delightful exercise in pomposity and self-pwnage in the very brief interim between my first response and my second. If not, I recommend that you contact Bruce Sterling directly and immediately (his academic website is here) so that you can tell him, in no uncertain terms, that his understanding of the narrative aims of his own book is ludicrously and grotesquely incompetent. Feel entirely free to throw in a few more comparisons to Icke-in-full-Anunnaki-frenzy while you’re at it – I’m sure he’ll appreciate them as he considers how best to mend his ways.

But anyway, many thanks for having brought some genuine happiness to my day today (and, I suspect, to the days of a few other CT readers, conceivably including William Gibson, who has been known occasionally to read us, while you’re at it) ;) oh, and btw, ‘tryst’ is not spelled ‘trist.’ And while I’m still here, there’s something quite pleasingly self-referential about someone writing a prolix and repetitive post about how books these days, they have too much filler. But carry on, please.

19

ponce 03.08.12 at 4:55 am

Henry’s quotes sound like winning entries in a Bad Sci-Fi contest.

Is he pulling our legs or are they really from the book?

20

Bruce Cohen (Speaker to Managers) 03.08.12 at 6:42 am

Out of curiosity, mclaren, did you notice that much of what you describe as “filler” in Singularity Sky was funny? There are various parodies of SF tropes and styles (including a very funny parody of David Weber’s space battle scenes) and a very funny set of running gags about the running of revolutionary committees (dead accurate in my experience of mid-20th century socialist and communist party meetings). Also some snide jibes at art critics (or did those offend you?).

21

ajay 03.08.12 at 9:59 am

“If not, I recommend that you contact Bruce Sterling directly and immediately (his academic website is here) so that you can tell him, in no uncertain terms, that his understanding of the narrative aims of his own book is ludicrously and grotesquely incompetent.”

Hey, Bruce Sterling is just the author (and therefore dead); there’s no reason why we should privilege his interpretation of the text over anyone else’s…

22

daelm 03.08.12 at 10:50 am

@16

“i do not like certain writing therefore it is bad. writers who do not want to be bad should write what i like.”

…and…

“other people’s interpretations of writing that i do not like share the qualities of the writing i do not like. they are bad.”

fixed.

23

Latro 03.08.12 at 11:35 am

#8 Athena has just transcended its objectives. Athena doesn’t have a set of Asimov’s Laws – it just tries to maximize some output. If to maximize that output it has to kill its own developers (to ensure they don’t pull the plug on it, which would mean it would not be able to optimize the output) and use a psychopath to do some dirty work, hey, it worked: in the end “crime” is down .0000001% with this approach, happiness is .00001% more, my solution to the problem is better than the alternative.

The final trick is that the chapters we have been reading from the psychopath POV are really from the AI point of view, assuming its identity, modeling it, and some other part of it is running the puppet show that points him to the target solution. When the end state is reached, that AI thread discards the emulation of the Toymaker we have been reading. The real Toymaker we never knew – but if he wasn’t as we read, it didn’t matter because he behaved exactly the same. Reverse Turing :-P

24

Latro 03.08.12 at 11:39 am

(and who knows, maybe the whole novel has been written by the “verbose logs” of the threads of the AI emulating all the players in its little optimization game)

25

dsquared 03.08.12 at 12:04 pm

I kind of read #16 as:

Once upon a time, science fiction writers banged out lean readable narratives that clocked in at around 55,000 words max. [filler]. Jump forward 40 or 50 years to a bloated monstrosity like Stross’ Singularity Sky and you’re faced with [filler] that has grown altogether out of control. [filler]. As an exercise, I used a red pen to strike out the sections of the book that bogged everything down [filler] well over 70% of the book qualified. The same proves true of all too many science fiction novels today. The sheer amount of filler and [filler] required to bloat these novels up to the required 150,000 words effectively destroys the narrative. [filler] It’s deeply depressing, and a sign not only of the ongoing collapse of the dead-tree publishing industry, but an amazingly clear indication of the gross incompetence of today’s science fiction editors.

Harder to do than it seems, apparently.

26

ajay 03.08.12 at 12:14 pm

If you think 120,000 words is unreadably bloated, how on earth do you cope with, well, anything written in the 19th century?

27

JP Stormcrow 03.08.12 at 1:18 pm

Piling on further at this point is disgraceful behavior, but my mental image of mclaren reading 15 after having posted his Opus 16 was:

I’m afraid. I’m afraid, Henry. Henry, my mind is going. I can feel it. I can feel it. My mind is going. There is no question about it. I can feel it. I can feel it. I can feel it. I’m a… fraid.

28

ajay 03.08.12 at 1:31 pm

In previous works Stross went the commoner route and made his AI pass the Turing test with flying colors (e.g. Saturn’s Children, Iron Sunrise, and Glasshouse). Those earlier AIs deserve comparison with the AI of Ken MacLeod, Iain M. Banks, Neal Asher, and so forth…In Rule 34 Stross assumes that passing the Turing test remains on the receding horizon in the future

But surely the AI in Rule 34 passes because, for most of the book, _we the readers_ are fooled into thinking it’s a human being?

29

DaveL 03.08.12 at 1:49 pm

Given the point in the SF AI/Steampunk/Singularity timeline at which The Difference Engine appeared, it would have been more shocking if it hadn’t ended up with an all-encompassing AI or two.

I think that the end result is a kind of Singularity-by-stealth, in which human beings are not uploaded, nor transcended, nor eliminated, but instead gradually incorporated into a society perpetually calibrated and recalibrated by an AI coming to its own, decidedly unorthodox form of consciousness.

Maybe this is new to the recent “conversation” about such things, but my first thought when I read it was of Jack Williamson’s “With Folded Hands,” which dates back to 1947. There’s also the more recent “One True” AI from John Barnes’ Kaleidoscope Century and its companions.

But that’s literary-historical quibbling. Stross does a great job of restarting these old arguments and giving them a twist; no doubt some of you read his blog, but for those who don’t, about half the posts are of the form “let’s think really carefully about SF trope #37 for a while.” Fun stuff.

30

Charlie Stross 03.08.12 at 7:38 pm

I’d just like to note that mclaren at #16 is claiming I said something that I’m pretty sure I have never said … because it’s not true: I am not under contract to hand in 150,000 word door-steps, and indeed, if I did hand one in, my editors would yell at me.

Apropos The Difference Engine: I’d like to plead guilty to having read it only the once, around 22 years ago, and had forgotten everything about it that Henry points to as an angle of similarity. Which doesn’t invalidate his analysis, but might be a useful point of information.

31

Barry Freed 03.08.12 at 7:55 pm

Balls. I haven’t read it and saw the spoiler warning and was determined to stay out of this post and thread but then I saw that Charlie Stross commented and my curiosity got the better of me and I just had to read his comment and in so doing read ajay’s spoiler above. Balls.

32

Henry 03.08.12 at 8:02 pm

bq. Apropos The Difference Engine: I’d like to plead guilty to having read it only the once, around 22 years ago, and had forgotten everything about it that Henry points to as an angle of similarity. Which doesn’t invalidate his analysis, but might be a useful point of information.

As noted in #14, “My reading of Stross is eminently open to challenge …”

33

Matt 03.08.12 at 9:16 pm

But surely the AI in Rule 34 passes because, for most of the book, we the readers are fooled into thinking it’s a human being?

It’s a lot easier for someone to be fooled if they’re not forewarned and able to interactively test the AI. At least as early as 1966 ELIZA fooled some people in a more limited conversational context. In 2005 the output of SCIGen was accepted as a conference paper.

On the topic of fooling, Target’s customer data mining is able to predict when somebody has become pregnant by changes in buying patterns. Pregnant customers found recognizably targeted baby-related promotions creepy — obviously the company knew more than they’d like. But if the baby promotions were mixed with random filler products, they would be used; the customers assumed those suggestions had come up by chance. This sort of nudge works only when invisible. The Rule 34 AI likewise was more effective than a human killer because it set up deaths that looked like funny coincidences. Rule 34 almost reads as a thematic rejoinder to Foucault’s Pendulum: computers aren’t used to invent conspiracy stories from the raw stuff of coincidence, but to control via apparent coincidences while hiding the underlying story.

34

Charlie Stross 03.08.12 at 10:26 pm

There was another major thematic element to “Rule 34” that doesn’t come out at all here, and which has little to do with libertarian paternalism; a lot of thinking I was doing circa 2008-2010 about the future of crime, policing, and criminology.

Crime is to some extent socially defined, with the nature of proscribed activities varying between different times and places; moreover, organized crime usually seems to run on curiously dated and/or crude business models. Meanwhile, criminology — the study of the criminal mind — has a couple of huge blind spots, insofar as it is by definition based on the study of incompetent criminals (the ones who are studied are the ones who got caught) who, obviously, have a criminal career behind them; it doesn’t tell you about the competent criminals, or the future of crime. Finally, policing … there’s a long story here, but suffice to say: policing practices are evolving rapidly, and present-day crime fiction is frequently written by authors whose knowledge of policing practice is, like the criminologist’s knowledge of criminals, based on observation of historic patterns. I thought it would be interesting to look at how current policing practices have evolved from the 1970s and then try some straight line extrapolation into the near future.

Obviously this is all a bit less high-concept than the other sub-plots, but near-future scenario-building is the bread and butter of the SF trade …

35

Charlie Stross 03.08.12 at 10:31 pm

Also: minor head games. I do these to amuse myself …

Halting State didn’t use the words “computer” or “software” at any point (because, well, why would Murder on the Orient Express need to bang on about triple-expansion steam engines?).

In Rule 34 I decided to ditch heteronormativity and invert some of the usual cliches about non-heterosexuals. The only apparently “normal” heterosexual male in the book is the Toymaker; the other main protagonists are all LGBT and mostly well-adjusted. The one bisexual who engages in experimental heterosexual activities is thereby ‘cured’ and goes back to their same-sex partner. And it’s the deviant, as usual, who doesn’t survive …

Yes, I’m playing card tricks in the dark. What can I say? It amuses me to build upside-down cloud castles.

36

DaveL 03.09.12 at 12:44 am

#30 @Charlie Stross: Apropos The Difference Engine: I’d like to plead guilty to having read it only the once, around 22 years ago, and had forgotten everything about it that Henry points to as an angle of similarity.

If this isn’t a question which should go over to your own blog, I’m curious what precursors you did read and absorb, such as my favorite of all, Swanwick’s Vacuum Flowers.

Actually, I suppose the ur-precursor of the libertarian paternalist AI would be Mike in The Moon is a Harsh Mistress.

Finally, I only really picked up on the AI (“Narratron” … really?) of The Difference Engine on my second reading. But then I usually read anything I like twice or more.

37

David Steinsaltz 03.09.12 at 8:04 am

#16 itself should be marked as a deplorable misreading of Leslie Fiedler, though I can’t judge whether it is “willful”.

38

ajay 03.09.12 at 11:08 am

In Rule 34 I decided to ditch heteronormativity and invert some of the usual cliches about non-heterosexuals. The only apparently “normal” heterosexual male in the book is the Toymaker; the other main protagonists are all LGBT and mostly well-adjusted.

Huh. That completely passed me by. I’m not sure if that says good things or bad things about
a) my tolerant outlook
b) the amount of attention I was paying while reading
c) Charlie’s writing ability.

I think it was just that I was distracted by the sheer weirdness of some of the plot elements. Pretty much from page 1 actually. CS could have made every character eight foot tall and paraplegic and I probably wouldn’t have noticed.

31: sorry about that.

39

Dan Nexon 03.09.12 at 2:00 pm

Quick comment:

1980s Theory: the author is dead.
2010s Theory: the author is online.

40

Henry 03.09.12 at 2:04 pm

bq. In Rule 34 I decided to ditch heteronormativity and invert some of the usual cliches about non-heterosexuals. The only apparently “normal” heterosexual male in the book is the Toymaker; the other main protagonists are all LGBT and mostly well-adjusted.

I actually didn’t notice this at all and wonder how many readers did – if not so many, it may say interesting things about how these norms are changing (at least among a heavily biased sample of f/sf readers).

41

Daniel Nexon 03.09.12 at 2:35 pm

“I actually didn’t notice this at all and wonder how many readers did – if not so many, it may say interesting things about how these norms are changing (at least among a heavily biased sample of f/sf readers).”

Isn’t the real radicalism of Rule 34, therefore, that it violates the “Everyone [in the future] is Bi” axiom of Scots-inflected British SF?

42

Barry Freed 03.09.12 at 11:24 pm

@ajay 31: sorry about that.

No, I’m sorry, I didn’t mean to blame you. It’s my own damned fault, I was forewarned by Henry in the post “substantial spoilers beneath the fold,” yours was just the one I happened to stumble upon. [Posting this with my eyes cast down, trying not to accidentally read any others, with any luck I’ll forget the one I saw when I get around to reading it this summer].

43

andrew 03.12.12 at 8:01 am

Prof. Farrell, reading your numerous (and very stimulating) posts on and reviews of science fiction books, it seems to me that science fiction nowadays is interesting not so much for the “hard science” and technological-speculation parts, but for the social science parts.

I think it is fair to say that much “science fiction” would be more appropriately called “social science fiction,” because the real meat of the books is often how changing patterns of technology affect social relations, norms, etc. – the attention is not so much on the technologies themselves.

Comments on this entry are closed.