Will reading this blog post decay your brain?

Reading this short blog post – or worse, simply skimming it so you can justify RT-ing it – may cause real damage to your brain. At least according to Nicholas Carr, whose provocative Atlantic article “Is Google Making Us Stupid?” has been expanded into a just-released book titled The Shallows.

In his new treatise Carr claims that as we twiddle on Twitter and diddle on Facebook, absorbed in repetitive and addictive activities, our brains are getting rewired. And not for the better.

Because the Internet encourages and reinforces “cursory reading, hurried and disparate thinking and worse, superficial learning,” it significantly diminishes our capacity for sustained concentration, deep thinking, and long-term memory.

To make his point Carr cites his own vanishing attention span, recent neuroscience findings, and a telling study of academic research showing that scholars are taking the easy way out when it comes to citing sources. They apparently do what we all do: search Google and use whatever comes up at the top. “We live in a world of abundance but all read the same thing,” concludes Carr.

He offers more of his opinion in a column in yesterday’s Wall Street Journal.

The picture emerging from the research is deeply troubling, at least to anyone who values the depth, rather than just the velocity, of human thought. People who read text studded with links, the studies show, comprehend less than those who read traditional linear text. People who watch busy multimedia presentations remember less than those who take in information in a more sedate and focused manner. People who are continually distracted by emails, alerts and other messages understand less than those who are able to concentrate. And people who juggle many tasks are less creative and less productive than those who do one thing at a time.

He argues for deeper reflection and for exercises that give us greater control over our attention.

Reading a long sequence of pages helps us develop a rare kind of mental discipline. The innate bias of the human brain, after all, is to be distracted. Our predisposition is to be aware of as much of what’s going on around us as possible. Our fast-paced, reflexive shifts in focus were once crucial to our survival. They reduced the odds that a predator would take us by surprise or that we’d overlook a nearby source of food.

However, in his New York Times review today, Jonah Lehrer isn’t totally convinced. The author of How We Decide and Proust Was a Neuroscientist (both of which I highly recommend you read; good ways to up your concentration) reminds us that changes in media habits and technology have always drawn criticism. Socrates lamented that books would create a “forgetfulness of the soul”; telegrams were initially condemned for their pelting speed; radio and television were accused of poisoning our minds.

Lehrer, a highly regarded science writer well versed in neuroscience, also points out that everything changes the brain. Countering Carr’s claims, he cites other studies concluding that the Internet makes us smarter. They’ve shown that gaming improves our performance on cognitive tasks and that searching on Google, by forcing our selective attention and deliberate analysis, leads to increased activity in the dorsolateral prefrontal cortex.

Clay Shirky (how could we have this discussion without Clay weighing in?) offered his thoughts yesterday in a counter-essay to Carr. Shirky reminds us that returning to the pre-Internet era of the 1980s isn’t all that desirable. During that decade “we actually spent a lot more time watching Diff’rent Strokes than reading Proust.”

Shirky also reminds us that the mindless nonsense populating much of the web will give way to more thoughtful and meaningful content:

Of course, not everything people care about is a high-minded project. Whenever media become more abundant, average quality falls quickly, while new institutional models for quality arise slowly. Today we have The World’s Funniest Home Videos running 24/7 on YouTube, while the potentially world-changing uses of cognitive surplus are still early and special cases.

He goes on:

The response to distraction, then as now, was social structure. Reading is an unnatural act; we are no more evolved to read books than we are to use computers. Literate societies become literate by investing extraordinary resources, every year, training children to read. Now it’s our turn to figure out what response we need to shape our use of digital tools.

I guess I’m with Lehrer and Shirky. True, the more time I spend online, the less time I spend reading lengthy narratives, committing the ideas within them to memory, and thinking critically about their meaning.

But consider that I discovered Nicholas Carr’s new book online.  My search led me to multiple reviews, to Cory Doctorow’s essay on Writing in an Age of Distraction, and to the Clay Shirky/Nicholas Carr debate in yesterday’s Wall Street Journal.

This two-hour exercise inspired this post, enabled me to synthesize and organize links and sources (sorry if they distract you), and left me with a number of pieces to read, review and think about once I’m done writing this.

I think I’ll go offline now and concentrate on something really important. Your thoughts?  Please share.  And if you got this far, feel good about it.  You actually concentrated on something.

Want to read up and think on your own?  Here are some links.

Boston Globe Review of The Shallows

New York Times Review

In Search of Memory by Eric Kandel

Is Technology Producing A Decline In Critical Thinking And Analysis? by Patricia Greenfield

Shirky in Journal

Paper Cuts: Stray Questions for Nicholas Carr

Bloomberg/Business Week Review

Merlin Mann’s 43 Folders

24 comments
WoW_Account

I think that technology both enhances and destroys the human mind. It depends on how you use the technology available, but truth be told, it has forged a generation of lazy people.

SusanOliver

Like water, technology can stimulate or immerse the user. It is neither healthy nor worthless; it is how one uses it that is the determinant.

Rasmus

What a well-written post. I feel smarter already!

It's definitely a topic that makes me think twice, not so much about whether we get dumber by reading online, but more about the influence of Google in particular. It's a valid point that the abundance is lost in what ends up in Google's top 10 search results. Which is actually kind of scary.

Matthew Piepenburg

Carr (and Mr. Boches) raises a provocative issue, inspiring devotees and detractors, advocates and apologists. The internet in general (and Twitter et al in particular) creates a kind of love-hate reaction. Personal philosophies on the “new media” – like any philosophy – are as diverse as the personalities who hold them (from J. Lehrer to Susan Jacoby, Clay Shirky to Harold Bloom). As such, a final answer will never please all temperaments. My prejudice is clear (though not necessarily correct): the new media bothers me.
It’s too much, offering too little, to too many.
Those who defend technology, innovation, the information boom, etc. legitimately remind us that all advances and “frontier-breaking” vehicles inevitably inspire reactionary growing pains and the comments of grumpy anachronisms like me (or even Socrates). But I am concerned. I fear that our Twitter era (and Twitter mind-set) creates a flow of information that far outpaces the evolution of the maturing intellect, which requires patience, attention and depth – nouns not otherwise associated with the internet or Twitter. Sound-bite data breeds sound-bite reflection and sound-bite understanding. As an author, a cranky old scholar and lover of language, communication, friendship and lean-into-the-table dialogues, I sincerely feel that as a culture we are increasingly less sophisticated, less expressive and less intelligent with words, feelings and ideas. I will concede that the internet and Twitter-like advances make us much, much more informed; this does not, however, make us wiser – and that is a distinction (opinion?) worth pondering.
Long before the internet, Saint-Exupéry wrote from war-torn France that we live in an era that can manufacture 5,000 pianos a day but not a worthy pianist. The observation resonates as much in 2010 as it did when penned in 1941. We need more focus, not more efficiency. We need to slow down, not speed up. We need deeper information, not faster information. I recently read an old, old letter written by a 21-year-old pilot during the Battle of Midway. The manner, style and depth of this young man’s observations were beautiful. Literally from another time. I would be hard-pressed to find a man of similar age in this era of the text message and “tweet” who could communicate as profoundly or touch his audience as deeply. Perhaps I am guilty of romanticizing the past. Admittedly, the good ol’ days weren’t always so good, and the internet is hardly without advantages. Nevertheless, my net assessment remains: the internet offers more of less. Although I speak in broad (even unfair) generalizations with many worthy exceptions, I feel that the internet – and Twitter in particular – predominantly breeds a distracted, rushed mind-set. The Twitter “way” replaces critical thinking with clever thinking; rather than send or respond to ideas from the perspective of patient, hard-won reflection, experience or knowledge, we have created a culture of rapid-fire data dumps whose authors and audiences often care more about entertaining, arguing or self-advertising than, well, inspiring. Our era seems less interested in the focused study of great ideas, connections and exchanges. But everything that is wonderful requires precisely that: focus, depth, time. One does not learn about literature through CliffsNotes; one does not learn the complexities of a foreign language or culture through CD-ROMs. Nor is an understanding of the immense complexity of the human heart and mind won through a string of emails, tweets or miracles in the Facebook inbox.
Again, growth in any human endeavor requires precisely what Twitter does not: time, complexity, perspective.
The often dangerous intersection between actual life and online life touches numerous themes, but for now just consider the theme of friendship. A timeless and universal ledger of great minds – Aristotle, Montaigne, Goethe, Emerson – all considered friendship man’s greatest wealth and considered one blessed if he or she could count at least one good friend in a lifetime. Today, the average Tweeter can have 40 to 400 “friends” online. But what does this say about us? About our connections? About the memorable quality and content of our exchanges? Of course, there is great value, as well as great personal and commercial efficiency, to be found online or in a tweet. There is rapid-pace networking, blitzkrieg flirting, advertising, PR-ing, dating, info-sharing, brand-awareness making, etc. In short: fantastic efficiency (and an immense amount of narcissism; see Jean Twenge’s new book on this topic). But like Saint-Exupéry, I’d prefer more art and less production; more intimacy and less efficiency; more letters and fewer texts; more sincerity and less advertising; more flesh and fewer pixels; more focus and less entertainment; more depth and less distraction – in short, more eyes looking into mine and fewer looking into a screen.
And yet how did I convey all this? On a blog. On a screen…in an electronic chatroom, alas, full of strangers. Perhaps the bigger question is always less about technology’s universal appeal and more about man’s universal need: to simply connect.

Jeremy Morris

It's got to be healthier to twiddle and diddle with Facebook and Twitter than what we used to twiddle and diddle with. Don't you think?

Zeenat Rasheed

Thanks for the great post, Edward. I think both sides have merit in their argument. When I first began using Twitter, I was completely overwhelmed by the amount of information being thrown at me and the social/intellectual pressure to keep on top of it all, so much so that the 'hurr' sound of TweetDeck (going off as each tweet came in) began to have a Pavlovian effect of stressing me out each time I heard it. But I learned to organize the way I used Twitter and prioritize its role in my life as a source of information (moving to HootSuite helped too!). Once my brain was wired to sort through the data, Twitter became indispensable to me - as a way to research, stay current on news and network/create relationships.

That being said, I do feel like I've lost the ability to concentrate on long stretches of text, and reading - particularly on screen - has become a challenge. As much as I believe that all new media deserve a shot and that we need to learn how they can become relevant to our lives, I can't find any way to spin this into a positive. We shouldn't allow older, necessary skills to languish in favor of newer, sexier ones.

So the only thing to do is practice. I'm committed to reading long text as much as possible and sticking with finishing the books, essays and articles I pick up - without skimming - so that my brain remains trained and agile enough to process both types of information.

It's all just part of the evolutionary process after all :-)

Addy Drencheva

Hi Edward,

As a typical Gen Y-er, I would argue that the Internet makes us smarter, but it depends on how we use it. It is a great way to find information and use your analytical skills to sort it out. The vast amount of information on the Internet doesn’t necessarily make us smarter, but the way we use it and engage with it does. Imagine if you had to go to a library (a real one) just to find something that might be available online. Imagine how much time you would waste. Time you could use to develop your skills or to come up with ideas.

People often say that because of the Internet we just skim through everything instead of engaging with it. I believe skimming is a good way to find out whether an article is relevant to your needs; if it is, you can engage with it further. In an age when we have so many options we need to save time, because a day unfortunately has only 24 hours.

The other aspect of my point relates to what Jeff said. It is not only about knowing information, but about engaging with it and understanding it. Engaging in healthy debates with other people and hearing their points. The Internet is perfect for that. Imagine if every time you had a thought you had to send a letter to one of the people on your blogroll. Then you had to wait for a response… Not the best option.

With all this said, I also think that it is a matter of how you use the Internet. If you only seek like-minded people and articles that support your views and thoughts, then it might make you dumber, or at least deprive you of the opportunity to become smarter.

Vince LaVecchia

Hello, thanks. Two thoughts:

1. I agree with Carri about learning from Twitter. I was always amazed at how much my dad knew about lots of things when I was a kid. But he had to spend WAY more time reading stuff to learn what I can learn faster. It's about will and motivation.

2. It's too early to predict how all this will affect us. Let's just wait and see and enjoy it while it lasts.

VRL

Tive

Great post Edward. Stimulating as usual. And I suppose that that is the point - stimulation.

New tech presents new concerns. As Shirky mentions, before the web, TV and radio poisoned our minds. Even novellas were seen as sinful in contrast to poetry. Yet the ability to read a book nowadays is seen as evangelical compared to looking at a screen. A TV was more interesting than a radio which was more interesting than a living room. And an online screen is more interesting than an offline screen.

I cannot speak for anyone other than myself when I say if my brain is rewiring itself, it is doing so to constantly seek new information and value in that information. I do not watch hours of X factor or the Wire each week, but I do scour the web for information that feeds my brain. I think the latter is better for me. But having read the WSJ article, I do agree that whilst concentrating on one thing, I know there is more stimulus to be found elsewhere and I am quicker to drop an activity if it doesn't reward my stimulation.

Whilst studying English at university, I would spend hours in the library whilst writing essays, poring over stacks of books to find opinions from critics and biographers who either reinforced my thoughts or dismissed them, but all of whom helped me mould my own. Whilst writing blog posts I do the exact same thing, except I only use a laptop. But I still complete the task, and it certainly takes no longer, thanks to my ability to multitask.

There is also a concern around social etiquette. Why do we find it normal when two people converse on a train but annoying when one person holds a conversation over a phone in the same scenario? For the same reason that my wife finds it frustrating for me to check my phone whilst I am a passenger in the car but is perfectly happy for me to fall asleep or sit in silence. I am the same whilst she is the passenger by the way. I think the 'it is not good for you' opinion may often be employed as an argument to avoid bad etiquette. But it is often ill-informed and based purely on our own innate knowledge which is a poor starting point.

It's great, therefore, to read scientific theory around the subject, but as concerned as we may be about the dopamine hit we get from a notification, I am sure people are no less drawn to articles and books that say we are doomed by our behaviour. Do we need 'deep' thought? Are we all philosophers? Or do we need interaction? Which is better - the ability to discover, comment and stimulate debate, or the ability to sit in silence for an hour and finish a novel?

Gretchen Ramsey

There's clearly a neural evolution afoot, but we are thinking more, not less. And yes, deep thought on a subject, or understanding, as Jeff put it, is a key indicator of true knowledge. I'd be careful, though, about equating deep thought with reading books or long tomes of text, as is suggested above. I'm certain our poetic heroes Wallace Stevens, T.S. Eliot, Gertrude Stein, et al. would be sorely surprised to find neuroscientists and thought leaders equating depth of knowledge with reading books - or reading at all, for that matter. What about other art forms? Exposure to human experience and information access cannot be harming us. Thank you for presenting both sides of the argument, Edward.

Carri Bugbee

Great post. I've been joking for a long time that social media gave me A.D.D. – because I think I have it now and I never felt that way in the past.

I also joke that I've learned more in 2+ years of participation on Twitter than I learned in all my college years. And that's pretty damn accurate. I've become a voracious consumer of information. While not all of it is "deep," all of it does interest and/or amuse me. Perhaps most importantly, I'm constantly being exposed to stuff I wouldn't seek out myself and I'm connected with thousands of people I couldn't ordinarily meet. This has expanded my world – and my world-view – exponentially. There's a lot to be said for that. And I still read books. Every day.

I realize I’m easily distracted in ways I wasn’t in the past. But I also realize I’m on the bleeding edge of online consumption. I think we’ll have more powerful ways to parse information in the next wave – and I’ll bet Shirky would concur. We’ll look back at this time, when we just blasted ourselves with the firehose every day, and say “What were we thinking?!” But we need a critical mass of content before filtering it becomes a viable endeavor.

That said, I do worry about youngsters who may not grow up learning to concentrate on “deep” tasks. When I was young, we had no such things as personal computers (Shhh!!), so I read a lot. I hope all the connectedness becomes a gateway to in-depth information gathering for the next generation – not merely a substitute.

@CarriBugbee
Social profiles: http://www.CarriBugbee.com

Catherine Ventura

Thanks for a very thought-provoking post. I think Carr's criticisms, though, may be a bit "short-sighted," given that short-form is nothing new. CNN Headline News debuted in 1982, Reader's Digest in 1922, Morse's telegraph in 1837, and Giotto painted some highly abridged versions of New Testament stories in 1305. Instead of lamenting the abundance of material calling out for our attention, I think what we should be celebrating is the remarkable abundance of access now available to a much broader group. We've already seen the short-term benefits -- an explosion of creativity, dialogue and innovation. Different and evolving does not necessarily equal worse!

Jeff Shattuck

I don't really know if I understand Carr's point, but one thing I will say about the text/Twitter world is that if you only read in short bursts, well, you'd be one of those annoying people who have an opinion on everything but understand nothing. As Einstein said, "Any fool can know; the point is to understand." And understanding takes time, not just because you have to read, but also because you have to discuss and debate and try your own thoughts on others and fail a few times. The Internet is great for this, of course, so if a bursty text leads you to read The Wealth of Nations cover to cover and then you go on HuffPo to debate, fabulous. If you stop at stuff like "Goldman bonuses high as ever," well, you're not someone I would want to talk to!

Mel Exon

Hi Edward

Thank you for this post.

Sometimes I wonder whether we need polarized points of view just to function as a society, much as the brain has left and right hemispheres... Both points of view ring true to some degree (sure, some of us are energised by the experience of white-water rafting the floods of data out there, but who doesn't feel an accompanying, natural sense of unease?)

If you're looking for other references, these may be useful:

1. We picked up on this topic in a BBH Labs post here: http://bbh-labs.com/the-coming-age-of-augmentation. I found what Jamais Cascio has to say about 'fluid intelligence' particularly interesting. Versus drowning in a deluge of distraction as described by Carr, he argues this represents a lasting, positive development (in other words, considerably more than a sort of short term, cognitive coping strategy):

“Fluid intelligence doesn’t look much like the capacity to memorize and recite facts, the skills that people have traditionally associated with brainpower. But building it up may improve the capacity to think deeply that Carr and others fear we’re losing for good. And we shouldn’t let the stresses associated with a transition to a new era blind us to that era’s astonishing potential. We swim in an ocean of data, accessible from nearly anywhere, generated by billions of devices. We’re only beginning to explore what we can do with this knowledge-at-a-touch…

…Strengthening our fluid intelligence is the only viable approach to navigating the age of constant connectivity.”

2. A point Shirky made about 'cognitive surplus' in the same WSJ piece you cite is also worth calling out:

"This linking together in turn lets us tap our cognitive surplus, the trillion hours a year of free time the educated population of the planet has to spend doing things they care about. In the 20th century, the bulk of that time was spent watching television, but our cognitive surplus is so enormous that diverting even a tiny fraction of time from consumption to participation can create enormous positive effects."

Given these examples, where I personally net out on this debate probably isn't too surprising: I'd prefer we focus our energies on finding ways to make this work for all of us - teach ourselves and one another to swim like pros, if you will (and if you don't mind the water analogy being stretched to breaking point) - instead of sitting on the receding shore, trying to turn back the tide.

Rufus Dogg

Like water follows the path of least resistance, so too does human endeavor. Why run when you can walk; why stand when you can sit; why sit when you can lie down.

Why learn to spell when you can spell-check; why memorize when you can Google; why read when you can skim; why comment on a blog when you can RT the link.

My evidence for us getting dumber just involves me popping my head out and looking around at my beer-drinking, porn-watching-on-the-Internet unemployed neighbors who have college-grad-kids moving back home. I guess I should write a book about that, but then I have a blog and that is much easier.

I'm getting tired and I feel an original thought coming on. I have to go lie down.

PS I discovered Nicholas Carr's book reading the WSJ. And I could not order it on my iPad. I don't know what that means.

Aaron

I would have to agree about the addictiveness of the Internet. Each morning I wake up, fire up my RSS feeds, and gorge on the constant streams of information. I've gotten better at narrowing those streams and turning down the volume to keep from becoming overwhelmed, but the sheer amount of information available makes me concentrate less on what I'm reading because of a creeping dread that I'm not getting to all the information I need/want to ingest. But at the end of the hour or so I spend online, I'd be hard pressed to recite what exactly it is that I learned.

But I have a positive view of the future. I feel that humans will learn to ride the wave of information in a way that will become more and more efficient. I find myself using technology to cut down on the amount of information I consume. And I have started to commit myself to reading books again, specifically to help buttress my concentration.

As for authors, Richard Restak offers some good tips on improving concentration. I'd love to hear suggestions of others.

edward boches

Gretchen:
Good points as always. As Lehrer noted, Socrates was bummed when man started to look outward instead of inward. And Shirky has reminded us that the only reason books are, on average, 200-300 pages long is the limitations (or capacity) of early binding systems. So in many ways, technology has constantly influenced how we engage with information and knowledge. First it's disruptive (books, printing presses, telegraph, radio, TV, computer, cell phone, Internet), then it becomes the status quo. That which we initially resist we later regret losing, replaced by something new.

Aliera Peterson

Carri—

I'll admit that though I'm one of the youngsters out there who did grow up with computers and all the distractions inherent in online platforms like Facebook and Twitter, I also find pleasure in the simple things, like curling up with a good book.

While I do find myself more susceptible to distractions when online, there's something about opening the cover of a book that still resonates in some deep, sacred part of me. Within a page or two, all distractions slip away and I'm caught up in whatever subject I have in hand.

Still I can't deny that we're all moving (at various speeds) toward that voracious and insatiable consumption of online content. The gleam of constant updates and stimulating information is hard to resist.

For some, that dogged pursuit is a necessary part of their industry. For others, it's the satisfaction in knowing that the answers to a million questions can be found via the Google toolbar, if you just know how to ask.

As with most things, there will always be those who cope with the cross-over between "old" and "new" better than others, those who will ignore both in the search of something faster, brighter, better and those who ride the waves and lulls without concern or comprehension.

Good luck to all as we work to find our place in the future of the digital data stream.

edward boches

Jeff:
"Goldman bonuses...." makes me mad enough that I might stop there, along with the next piece on how BP knew or expected disaster based on pressure tests. So I don't necessarily agree with that; critical thinking can start with any soundbite, yes? As for Carr, I simply think he's one of the many who argue that our limited attention spans and our unwillingness to spend serious time learning, mastering or studying a subject - turning us instead into surface skimmers who float across a broad range of subjects without going deep on any - are, in the long term, bad for our cognitive abilities. He may or may not be right, just as those who argue that cell phones will create a future generation suffering from incurable brain tumors may or may not be right. However, in an effort to come to our own conclusions, without depending on whatever Google delivers when we conduct a search, it might be wise to study, absorb, learn and memorize some of the important stuff. Just saying.

edward boches

Mel:
It was worth writing this post just to have your comment. As always, I learn more from my readers than I could ever teach them. So thanks. I will get on the fluid intelligence topic and add it to my list of things to actually concentrate on. I think we're both in agreement; without the web and exchanges like this, our own continued learning would be stunted.

Rufus Dogg

Arghhh.. too late...

edward boches

Rufus:
Don't think too hard about all this. You'll hurt yourself. ;-)

edward boches

Aaron:
Thanks for reading and the great comment. Will check out Restak. I find myself in the same position, both personally and professionally. Though at the same time, the knowledge and contacts I find online are tremendously helpful with any endeavor. And, in fact, dialogue such as this helps me think and develop ideas and POVs. I agree that we will find new ways to develop our critical thinking in light of technological change. Who says books were the be-all and end-all anyway? Publishers.
