The Ghost in the Machine

by Henry Farrell on January 25, 2010

“Nicholas Carr announces his forthcoming book”:http://www.roughtype.com/archives/2010/01/the_shallows_ta.php

My next book, The Shallows: What the Internet Is Doing to Our Brains, argues that the tools we use to think with – our “intellectual technologies” – not only shape our habits of thought but exert an actual physical influence on the neurons and synapses in our brains.

ummm … not wanting to get too reductionist, but how could something that shaped habits of thought _not_ have consequences for physical processes with neurons and synapses and all that other good stuff? Also, I think the book would be _much_ better if it were titled _The Shallows: What the Internet is Doing to our Brainz. BRRRAIIINNZZZ ! ! !_, but then, I reckon that pretty well any book in this broad genre could be improved by “learning from the master”:http://en.wikipedia.org/wiki/Cell_%28novel%29 and adding some good zombie action.

{ 31 comments }

1

Salient 01.25.10 at 7:56 pm

Just piling on here,

…exert an actual physical influence on the neurons and synapses in our brains…

Protip for Carr: if you feel the need to emphasize what you’re saying by employing the word “actual” as a superfluous adjective, you’re getting too excited about whatever it is you’re saying. But for added emphasis, you can blame this overexcitement on what Internet use has done to your neurons!

2

Robin Green 01.25.10 at 8:34 pm

Assuming that he means to suggest some sort of negative influence – will he manage to do what the illustrious Baroness Greenfield has singularly failed to do, and provide some actual evidence for this claim?

3

Substance McGravitas 01.25.10 at 8:57 pm

This is your brain. This is your brain on your brain. Any questions?

4

Cosma Shalizi 01.25.10 at 10:01 pm

Perhaps Carr is saying that the mechanism of action isn’t entirely through the endocrine system?

5

Kenny Easwaran 01.26.10 at 1:56 am

I assumed your link was to _Pride and Prejudice and Zombies_.

6

dfreelon 01.26.10 at 3:14 am

Damn you, McLuhan! Will we never be free of your unshakable grip on the public’s perception of mass media effects?!

7

tomslee 01.26.10 at 12:54 pm

I’m a regular reader of Carr’s blog and thought his last book (_The Big Switch_) very good. This one was sparked by the success of an essay in The Atlantic (“Is Google Making Us Stupid?”). I share in the doubts about yer actual brains, but I’m inclined to give Carr the benefit of the doubt and wait to see what he has to say before joining in.

8

Henry 01.26.10 at 1:33 pm

Oh, I thought that _The Big Switch_ was a good book too, and Carr is obviously very smart. I didn’t think that the essay which prompted this was very good or original though – perhaps the book is better.

9

will u. 01.26.10 at 3:10 pm

@ 5: Whereas I assumed it’d be to John Quiggin.

10

Timberlee 01.26.10 at 7:15 pm

I think it’s an important point. An obvious point, but an important one nonetheless: people are getting dumber by using Twitter and Facebook. People understand the danger of smoking or drinking alcohol, but they don’t comprehend the dangers of other tools they use, like Facebook and Twitter. A great discussion on the effects of ubiquitous Facebook photos and Twitter on people’s lives:
http://www.pandalous.com/topic/ubiquitous_facebook

11

Farren 01.26.10 at 10:22 pm

The “actual physical influence” thing is stating the massively obvious, but maybe it was just a poor choice of wording and he meant to express something more interesting.

As far as the Internet dumbing people down is concerned, my intuition from personal experience is the opposite. My three wonderful nephews are extraordinarily bright, and growing up around the Internet, they’ve soaked up vast amounts of information.

I’ve just finished playing a game online with one of them, who lives on another continent, and he speaks at a machine-gun pace. I’m often a bit dazed by the speed at which he responds to things I say, articulating himself exquisitely, at times parsing complex sentences, and coming back with responses that show considerable depth of thought almost the moment I stop speaking. This from a kid who gets Internet withdrawal symptoms after a week of its absence.

12

Farren 01.26.10 at 10:28 pm

Or, “the kids are alright”. They’re just different. Yes, we’ve lost skills. I’ve little doubt that somewhere in the ancestry of all of us there were human beings who had the extraordinary tracking ability and elephantine memory (like remembering where you buried an egg full of water in a vast desert) still evident in nomadic Khoi-San people, but it’s not like we didn’t gain stuff along the way. I know I shouldn’t prejudge, but whenever I hear about stuff like this it always comes across as someone spending an awful lot of time examining the empty half of the glass and ignoring the other part.

13

Peter B. Reiner 01.27.10 at 12:11 am

Sorry to rain on your Carr-bashing party, but there is indeed evidence (reviewed here and here) out there that multitasking changes one’s brain. And while one might think it a trivial thing, I doubt that any of you would allow me to change your brain, in ways unknown to you, without your permission. That is, in essence, what we have all done.

14

Substance McGravitas 01.27.10 at 12:18 am

Sorry to rain on your Carr-bashing party, but there is indeed evidence (reviewed here and here) out there that multitasking changes one’s brain.

Can you list the things you can do that don’t change your brain? Asking for a perfect friend.

15

Farren 01.27.10 at 12:24 am

Peter, I’m not sure how linking to a bunch of opinions about changes in cognitive style bears out your claim of evidence, or addresses the issue I raised above, namely that we’ve been losing abilities as long as we’ve been acquiring new ones. It’s not enough to show that an ability is lost. I can prove to you that 99% of urban dwellers have lost the tracking ability of our hunter-gatherer forebears. That does not establish that there has been a net loss of abilities that affect human welfare.

16

Farren 01.27.10 at 12:38 am

I’ve scanned my whole life, since long before the Internet, trying to get the essentials rather than read every word. Handy when Tolstoy blathers on for 4 pages about a freaking wedding dress. I stop and savour particularly attractive turns of phrase, but yeah, I miss a lot. I also got into the habit at a very early age of reading up to 9 books, both fiction and non-fiction, in a month, every month – sometimes several at once. It’s equipped me with an encyclopedic knowledge. I started using the net in the days of Mosaic and it was a natural fit for my reading style. Unlike my dad at my age (40), who was similarly interested in all things scientific when he was younger, I’m still learning at a fairly rapid pace. At least one of my nephews is even more of an information soak, despite having to do two years of remedial schooling for dyslexia (from which he proceeded to a school for the gifted). And he can’t live without access to the net.

Again, this is all anecdotal, but I have a powerful intuition that these examinations of changes in cognitive style (a) wilfully ignore any gains that might offset perceived losses and (b) are motivated by the soon-to-be historical experiences of a generation for whom the transition is not as comfortable.

17

text 01.27.10 at 1:29 am

I’ve scanned my whole life . . . [h]andy when Tolstoy blathers on for 4 pages about a freaking wedding dress.

That explains it.

I think the issue here isn’t whether our brains have changed before — I don’t think anyone is arguing with you on that point — but whether this particular change (losing attention span?) is a good change relative to whatever benefit you think is accruing. Are you proposing that our ability to learn is increased by a loss in attention span? Maybe the benefit is in self-regard?

18

Matthias Wasser 01.27.10 at 3:10 am

And while one might think it a trivial thing, I doubt that any of you would allow me to change your brain, in ways unknown to you, without your permission. That is, in essence, what we have all done.

But assuming a non-paradoxical reading of “allow… without one’s permission,” that’s… exactly what you just did.

19

Matthias Wasser 01.27.10 at 3:14 am

Also: one often sees this sort of triviality employed in nature/nurture debates, as in: how can you say this is due to socialization… when men and women are actually different in their brains!?!

20

geo 01.27.10 at 3:22 am

trying to get the essentials rather than read every word

But sometimes — always in poetry, and often enough in prose — the rhythm, music, texture, interplay, etc., of the words is essential.

21

Cosma Shalizi 01.27.10 at 3:58 am

Let me try to lay out, as clearly as I can, the steps Henry skipped: Thought is one of the things the brain does. Any change in what you think has to be matched by, or, if you like, come from, a change in your brain. So when Carr says that the Web changes your thoughts, he adds not one little bit when he says that it changes your brain too. He could say the same about chess.

22

Peter B. Reiner 01.27.10 at 4:06 am

Farren @ 15. I concede your point about opinions, as the Edge comments are just that. But the second post I referred to focuses on a recent study in PNAS, which is hardly opinion. To be fair, neither is it conclusive.

But that is just a distraction from the most important point of all, which I shall take a moment to clarify. I have no need to become encyclopedic, as Farren suggests @ 16; after all, I have access to an unprecedented storehouse of knowledge at my fingertips. Rather, I believe that the most important cognitive trait we can facilitate is our ability to engage in reflective thinking, the sort of considered and extended analysis which is increasingly rare. Multitasking – jumping from hyperlink to hyperlink, from blog to news to an academic paper – does nothing to enhance that ability and may even degrade it. I would submit that this is to our common detriment, and that our intellectual lives are somewhat impoverished by it (lively discussions such as this notwithstanding). But what is most interesting about the phenomenon to me is that it has occurred surreptitiously, certainly with good intent, but clearly with potential for untoward effects. Given how glued we all seem to be to our computers, it is worth at least reflecting for a moment on whether it is the best course for our brains to follow.

For the record, even though today is my 55th birthday, I am hardly uncomfortable with the transition to modern technology. Indeed, the concern is how simple it is for me to be drawn into the damn thing.

23

Aulus Gellius 01.27.10 at 4:15 am

Now if you could show that intellectual technologies “shape our habits of thought but DO NOT exert ANY actual physical influence on the neurons and synapses in our brains,” that would be pretty remarkable.

24

Nick Caldwell 01.27.10 at 4:32 am

I’m actually shuddering at the deep misanthropy and misogyny evident in that Pandalous discussion linked to by Timberlee.

It’s funny, isn’t it, how it’s always young women who are the archetypal “bad” technology users who are degrading themselves and others by its use. God, they’re taking photos of each other. Next thing it’ll be sex standing up, which might lead to dancing.

25

Doctor Science 01.27.10 at 4:35 am

Didn’t Socrates say something about how kids these days are writing things down and reading them, instead of memorizing them like WE DID, uphill both ways, and how this made them a bunch of dummies? Who play awful music, too? What *do* they teach them in these schools?

My point is, keeping knowledge outside of one’s personal brain — offsite storage, I call it — means that each individual person may not be as personally smart, but the smartness each one can use is much greater.

My favorite analogy: Aristotle was famous for knowing a lot, and he was undoubtedly smarter than I am. But 20 years ago I’d point out that I plus the Columbia Encyclopedia knew a lot more than Aristotle. Now, my 13-year-old plus the Internet knows *hugely* more than Aristotle, or even than I did 20 years ago.

26

geo 01.27.10 at 5:13 am

Dr. S: Perhaps poor, dumb old Socrates had a point: i.e., that an oral culture has a distinctive and fragile beauty. That something has a distinctive and fragile beauty doesn’t mean that history must come to a dead stop for fear of its being lost. It means that one should reckon with the prospect of losing it and the possibility of preserving it, and not just leave the whole matter to chance, the market, the inexorable march of technology, etc. In the present case, it means that traditional literacy gives access to the matchless beauty of Elizabethan lyrics, Milton’s epics, Dickens’s and Tolstoy’s prose, etc., and that therefore we should take care that it doesn’t become vanishingly rare, the quaint indulgence of a minuscule minority.

27

b9n10nt 01.27.10 at 6:00 am

A “we” that “takes care of” valuable social practices must itself exist prior to any subject of its affection. I suppose that what cultural preservationists actually want is any benign social agent that can preserve something.

28

S. Turner 01.27.10 at 6:27 am

The offsite storage analogy is a good one, but what about the quantitative-to-qualitative shift that occurred in human evolution, giving us this big brain? Won’t mounds of electronic data start to ‘walk like a duck’ at some point?

Physical labour was once loaded onto slaves. And over time, non-human technology proved to be much more desirable for reducing the effects of toil. Human action has reached technological saturation. Naturally, everyone has physical health worries.

There never was such a thing as an intellectual slave even though slaves were consulted by their masters on intellectual matters. In most societies, the problem of tiresome and difficult intellectual labour was dealt with instead by the emergence or rescuing of a professional elite.

If a technology comes along which addresses the tiresome intellectual labour problem, then older adults understandably race to embrace it. All those facts we can’t remember? All the names and phone numbers we need handy to keep our lives going? All on the trusty laptop or whatever.

And if you’re younger and have been educated by an anti-memorization pedagogy, you don’t lose your cognitive skills unless you lose your device. Which brings me to my first and last point. We’re allowing our minds to relax and empty themselves of thought altogether — no, it’s not yoga — by ‘thinking’ with our devices, saving and savouring our memories with our devices, expressing our creativity with our devices and creating a hermetically sealed world of experience accessible only through our devices.

If I were regularly attached to one of these devices, I bet it would not be long before it would seem to be acting a lot like me; before it started to feel, in my head, like part of my brain or body, at any rate.

29

Zamfir 01.27.10 at 8:29 am

There never was such a thing as an intellectual slave even though slaves were consulted by their masters on intellectual matters.

To the contrary, intellectual slaves were run of the mill in the Roman Empire. Teachers, bookkeepers, personal physicians and secretaries were usually slaves. The Roman Empire had slave architects, slave playwrights, slave literary publishers.

30

Ah 01.27.10 at 8:51 am

A lot of this discussion assumes that what is important in being human and interacting with the world is knowledge – facts that might be stored in our brains or in Google. But there is increasing evidence that social interaction is at least as important in being human. The Internet allows more social interaction (good) and different forms of social interaction (maybe bad), which at least at the moment lack the nonverbal cues (eye contact, smiles) that facilitate live interactions. These different kinds of social interaction will surely affect our brains and our society, but I don’t think anyone can say if the effects are good or bad.

31

Doctor Science 01.28.10 at 12:59 am

Ah:

Exactly. I was just putting on my Actual Evolutionary Biologist™ hat to say that there is no question that the driving force in the evolution of human intelligence is *social* intelligence. Knowing facts and skills about the (rest of the) physical world will only force intelligence to evolve so far. The evolution of social intelligence is a positive-feedback cycle: you become smarter to better predict what your fellows will do, and *they* become smarter and so harder to predict, so you become smarter still. Next thing you know (geologically speaking), your brains have blown up like a balloon.

So I think S. Turner is quite wrong:

We’re allowing our minds to relax and empty themselves of thought altogether —no, it’s not yoga—by ‘thinking’ with our devices

— people who keep a lot of factual knowledge on the Internet aren’t emptying their minds of thought, they’re using more of their minds for their core purpose: learning about other people. Socializing, in other words.
