Is Google Making Us Stupid?

"Is Google Making Us Stupid?: What the Internet is doing to our brains" is a magazine article by technology writer Nicholas G. Carr highly critical of the Internet's effect on cognition. It was published in the July/August 2008 edition of The Atlantic magazine as a six-page cover story.[1] The essay builds upon Carr's book The Big Switch: Rewiring the World, From Edison to Google, in particular the last chapter, "iGod".[2] Carr's main argument is that the Internet might have detrimental effects on cognition that diminish the capacity for concentration and contemplation. As the title indicates, the article specifically targets Google, although it also generalizes about the cognitive impact of the whole Internet and World Wide Web.[3][4][5]

The essay was extensively discussed in the media and the blogosphere, and reactions to Carr's argument were polarized. At the Britannica Blog, part of the discussion focused on the apparent bias in Carr's argument toward literary reading. In Carr's view, reading on the Internet is generally shallower than reading from printed books, which he believes exercises a more intense and sustained form of reading.[6] Elsewhere in the media, the Internet's impact on memory retention was discussed; and, at the online scientific magazine Edge, several contributors argued that it was ultimately the responsibility of individuals to monitor their Internet usage so that it does not impair their cognition.

Although no long-term psychological or neurological studies have yet yielded definitive results verifying Carr's argument, a few studies have provided glimpses into the changing cognitive habits of Internet users.[7] A UCLA study led some critics to wonder whether the greater breadth of brain activity observed in functional magnetic resonance imaging (fMRI) scans of users performing Internet searches was facilitating reading and cognition or, on the contrary, overburdening the mind, and whether the additional activity in regions known to control decision-making and complex reasoning had any correlation with quality of thought.

Background

Prior to the publication of Carr's essay in The Atlantic, critics had long been concerned about the potential of electronic media to displace literary reading with other types of reading.[8] In 1994, American academic Sven Birkerts published The Gutenberg Elegies: The Fate of Reading in an Electronic Age,[9] a collection of essays declaiming against the declining influence of literary culture, understood as the tastes in literature favored by a social group, with the central premise that alternative delivery formats for the book are inferior to its paper incarnation.[10][11][12] Birkerts was spurred to write the book by his experience with a class he taught in the fall of 1992, in which the students exhibited little appreciation for the literature assigned to them, a failing he attributed to a shortfall in their aptitude for deep reading, a term coined in his book.[13][14][15]

In 2007, developmental psychologist Maryanne Wolf took up the cause of defending reading skills and literary culture in her book Proust and the Squid: The Story and Science of the Reading Brain, approaching the subject from a scientific angle in contrast to Birkerts' cultural-historical one.[4][11][16][17] Carr attributed some of the statements in his essay to Wolf's book and quoted from an interview in which she discussed the processes involved in "deep reading",[18] the term he used in the essay for the type of reading he felt he was increasingly struggling to achieve. The term had been coined by Sven Birkerts in The Gutenberg Elegies, though Wolf later, independently, defined it as a set of cognitive processes in an essay co-written with Mirit Barzillai, "The Importance of Deep Reading."[14][19][20][21][2] There, "deep reading" is defined as "the array of sophisticated processes that propel comprehension and that include inferential and deductive reasoning, analogical skills, critical analysis, reflection, and insight."[22]

Wolf had made her own comments on the potential drawbacks of screen reading in essays published concurrently with her book's release. In The Boston Globe, she expressed her grave concern that children who are heavy users of the Internet could become mere "decoders of information who have neither the time nor the motivation to think beneath or beyond their googled universes", and cautioned that the web's "immediacy and volume of information should not be confused with true knowledge".[23] Wolf contended, in an essay published by Powell's Books, that some of the reading brain's strengths could be lost in future generations "if children are not taught first to read, and to think deeply about their reading, and only then to e-read".[24] Preferring to maintain an academic perspective, however, Wolf stressed that her speculations had not yet been scientifically verified but deserved serious study.[19][20]

Carr was inspired to write "Is Google Making Us Stupid?" by the difficulty he found he was having in remaining engaged while reading, not only with books he had to read but even with books he considered very interesting.[5] He has stated that the material in "iGod", the final chapter of his 2008 book The Big Switch: Rewiring the World, From Edison to Google, provided a basis for the essay.[25]

Synopsis

At the start of the essay, Carr says that his recent difficulties with concentrating while reading books and long articles could be due to spending a lot of time on the Internet. He posits that regular Internet usage may have the effect of diminishing the capacity for concentration and contemplation. He prefaces his argument with a couple of anecdotes from bloggers on their changing reading habits, as well as the findings of a 2008 University College London study titled "Information Behaviour of the Researcher of the Future" which suggests the emergence of new types of reading. He cites scholar and author Maryanne Wolf for her expertise on the role of media and technology in learning written languages, explaining that unlike speech, an innate ability hardwired into the human brain, the ability to read has to be taught in order for the brain to rearrange its original parts for the task of interpreting symbols into words. He acknowledges that his argument about the impact of regular Internet usage on the brain does not, as of 2008, have the backing of long-term neurological and psychological studies. Carr further draws on Wolf's work, particularly her 2007 book Proust and the Squid, to relate his argument to the way in which neural circuits in the reading brain are shaped by the demands particular to each written language, such as Chinese, Japanese, and alphabet-based scripts.[25][26] Therefore, Carr asserts that the neural circuitry shaped by regular Internet usage can also be expected to be different from that shaped by the reading of books and other page-based written material.

Carr begins his argument by reasoning how the capacity to concentrate could be weakened by regular Internet usage. He cites a historical example involving Friedrich Nietzsche's use of a typewriter, a fairly new technology in the 1880s. According to German scholar Friedrich A. Kittler, Nietzsche's prose style changed when he started using a typewriter, which he adopted because failing eyesight was making it increasingly difficult for him to write by hand. Carr goes on to explain that, as of 2008, scientific research in the field of neuroplasticity has demonstrated that the brain's neural circuitry can in fact be rewired. In the humanities, sociologist Daniel Bell coined the term "intellectual technologies" for technologies that extend the brain's cognitive faculties, and Carr argues that the human brain consequently adopts the qualities of these intellectual technologies, using the mechanical clock as an example. Carr then contends that the cognitive impact of the Internet will be far more encompassing than that of any previous intellectual technology because the Internet is increasingly performing the services of most other intellectual technologies on the market; as the Internet replaces these technologies, it expands its impact. This trend is of particular concern, Carr argues, because the capacity to concentrate is hindered by the Internet's prevalent style of presentation, which surrounds content with distractions such as ads and obtrusive notifications. The impact of the Internet on the capacity to concentrate, he adds, is being compounded by traditional media as they adopt a style of presentation that mimics the Internet in order to remain competitive as consumer expectations change.

In the second part of his argument, Carr theorizes that the capacity to contemplate could also diminish as computer algorithms unburden an Internet user's brain of more and more of the painstaking knowledge work — the manipulation of abstract information and knowledge — that was previously done mentally and manually. In comparing the Internet with Frederick Winslow Taylor's management system for industrial efficiency, Carr makes the point that back then some workers complained that they felt they were becoming mere automatons due to the systemic application of Taylorism — a theory of management that analyzes and synthesizes workflow processes to improve labor productivity. Carr selects Google as a prime example of a company in which computer engineers and software designers have applied Taylorism to the knowledge industry, delivering increasingly robust information that could have the effect of minimizing opportunities to ponder ambiguities. Additionally, he asserts that the Internet's dominant business model is one that thrives as companies either collect information on users or deliver them advertisements; therefore companies capitalize on users who move from link to link rather than those who engage in sustained thought.

Finally, Carr places his skepticism in a historical context, reflecting upon how previous detractors of technological advances have fared. Carr points out that earlier skeptics, such as Socrates with his concerns about written language and the 15th-century Venetian editor Hieronimo Squarciafico with his concerns about printed works, were often correct but failed to anticipate the benefits that these technologies might hold for human knowledge. As an afterthought, a 2005 essay by playwright Richard Foreman is excerpted for its lament of the waning of the "highly educated and articulate personality".[27]

Reception

We can expect … that the circuits woven by our use of the Net will be different from those woven by our reading of books and other printed works.

Nicholas Carr, "Is Google Making Us Stupid?".[21]

Carr's essay was widely discussed in the media both critically and in passing. While English technology writer Bill Thompson observed that Carr's argument had "succeeded in provoking a wide-ranging debate",[4] Damon Darlin of The New York Times quipped that even though "[everyone] has been talking about [the] article in The Atlantic magazine", only "[s]ome subset of that group has actually read the 4,175-word article, by Nicholas Carr."[28] The controversial online responses to Carr's essay were, according to Chicago Tribune critic Steve Johnson, partly the outcome of the essay's title "Is Google Making Us Stupid?", a question that the article proper doesn't actually pose and that he believed was "perfect fodder for a 'don't-be-ridiculous' blog post"; Johnson challenged his readers to carefully consider their online responses in the interest of raising the quality of debate.[29][30] Calling it "the great digital literacy debate", British-American entrepreneur and author Andrew Keen judged the victor to be the American reader, who was blessed with a wide range of compelling writing from "all of America's most articulate Internet luminaries".[30] Academic Maryanne Wolf, however, found the responses to Carr's essay on the Britannica Blog to be "reductionistic," particularly singling out Clay Shirky's comment.[18]

Book critic Scott Esposito pointed out that Chinese characters are incorrectly described as ideograms in Carr's essay, an error that he believed undermined the essay's argument.[31] The myth that Chinese script is ideographic had been effectively debunked in scholar John DeFrancis' 1984 book The Chinese Language: Fact and Fantasy;[32] DeFrancis classifies Chinese as a logosyllabic writing system.[33] Carr acknowledged that there was a debate over the term 'ideogram', but in a response to Esposito he explained that he had "decided to use the common term" and quoted The Oxford American Dictionary to demonstrate that it likewise defines Chinese characters as instances of ideograms.[34]

Writer and activist Seth Finkelstein noted that predictably, several critics would label Carr's argument as a Luddite one,[35] and he was not to be disappointed when one critic later maintained that Carr's "contrarian stance [was] slowly forcing him into a caricature of Luddism".[36] Then, journalist David Wolman, in a Wired magazine piece, described as "moronic" the assumption that the web "hurts us more than it helps", a statement that was preceded by an overview of the many technologies that had been historically denounced; Wolman concluded that the solution was "better schools as well as a renewed commitment to reason and scientific rigor so that people can distinguish knowledge from garbage".[37]

Several prominent scientists working in the field of neuroscience supported Carr's argument as scientifically plausible. James Olds, a professor of computational neuroscience who directs the Krasnow Institute for Advanced Study at George Mason University, was quoted in Carr's essay for his expertise, and upon the essay's publication he wrote a letter to the editor of The Atlantic in which he reiterated that the brain was "very plastic", referring to the changes that occur in the organization of the brain as a result of experience. In Olds' opinion, given the brain's plasticity, it was "not such a long stretch to Carr's meme".[38] One of the pioneers of neuroplasticity research, Michael Merzenich, later added his own comment to the discussion, stating that he had given a talk at Google in 2008 in which he had asked the audience the same question that Carr asked in his essay. Merzenich believed that there was "absolutely no question that our brains are engaged less directly and more shallowly in the synthesis of information, when we use research strategies that are all about 'efficiency', 'secondary (and out-of-context) referencing', and 'once over, lightly'".[39] Another neuroscientist, Gary Small, director of UCLA's Memory & Aging Research Center, wrote a letter to the editor of The Atlantic stating his belief that "brains are developing circuitry for online social networking and are adapting to a new multitasking technology culture".[40]

Initially, at an Ad Age Madison & Vine conference, Google CEO Eric Schmidt dismissed Carr's argument;[41] however, many months later, he stated on The Charlie Rose Show that he was "[worried] that the level of interrupt, the sort of overwhelming rapidity of information — and especially of stressful information — is in fact affecting cognition. It is in fact affecting deeper thinking. I still believe that sitting down and reading a book is the best way to really learn something. And I worry that we’re losing that."[42][43][44][45]

Testimonials and refutations

In the media, journalists offered many testimonials and refutations of the first part of Carr's argument, regarding the capacity for concentration; treatments of the second part, regarding the capacity for contemplation, were far rarer.[46] Aside from columnist Andrew Sullivan's disclosure that he had far less leisure time for contemplation than when he was growing up,[47] the anecdotes journalists provided that touched on the capacity to contemplate concerned third parties, such as columnist Margaret Wente's account of a consultant who had noticed a growing tendency among her clients to provide ill-considered descriptions of their technical problems.[46][48]

In one anecdote, columnist Leonard Pitts of The Miami Herald described his own difficulty sitting down to read a book, reporting that he had begun to feel like he "was getting away with something, like when you slip out of the office to catch a matinee".[49] Technology evangelist Jon Udell admitted that, in his "retreats" from the Internet, he sometimes struggled to settle into "books, particularly fiction, and particularly in printed form".[50] He found portable long-form audio to be "transformative", however, because he can easily achieve "sustained attention", which makes him optimistic about the potential to "reactivate ancient traditions, like oral storytelling, and rediscover their powerful neural effects".[12][50]

Firmly contesting Carr's argument, journalist John Battelle praised the virtues of the web: "[W]hen I am deep in search for knowledge on the web, jumping from link to link, reading deeply in one moment, skimming hundreds of links the next, when I am pulling back to formulate and reformulate queries and devouring new connections as quickly as Google and the Web can serve them up, when I am performing bricolage in real time over the course of hours, I am 'feeling' my brain light up, I and [sic] 'feeling' like I'm getting smarter".[4][51] Web journalist Scott Rosenberg reported that his reading habits are the same as they were when he "was a teenager plowing [his] way through a shelf of Tolstoy and Dostoyevsky".[52] In book critic Scott Esposito's view, "responsible adults" have always had to deal with distractions, and, in his own case, he claimed to remain "fully able to turn down the noise" and read deeply.[31][46]

Analysis

Technological progress

In response to Carr's essay, a few critics countered with hardline positions on whether a society can control technological progress. At the online scientific magazine Edge, Wikipedia co-founder Larry Sanger argued that individual will was all that was necessary to maintain the cognitive capacity to read and think. Carr described this argument as "a variation on the old 'guns don’t kill people; people kill people' theme." In his view, "We control some aspects of our technologies, but our technologies control some aspects of us."[3] Computer scientist and writer Jaron Lanier rejected the idea that technological progress is an "autonomous process that will proceed in its chosen direction independently of us",[29] and went on to argue that technology was in fact hindered by the notion that "there is only one axis of choice", either pro- or anti-, when it comes to technology adoption.[29] Carr had set forth his own argument on technological progress in his book The Big Switch, where he held that an individual's personal choice toward a technology has little effect on technological progress.[25][53] There, Carr had addressed a view similar to Lanier's, articulated by American historian Lewis Mumford in his 1970 book The Pentagon of Power. Mumford had suggested that the technological advances that shape a society could be controlled if the full might of a society's free will were employed,[25][54] but, according to Carr, this view was incorrect because it regarded technology solely as advances in science and engineering rather than as an influence on the costs of production and consumption. Economics was the more significant consideration, Carr wrote in The Big Switch, because in a competitive marketplace it is the most efficient methods of providing an important resource that prevail. As technological advances shape society, in Carr's opinion an individual could choose to resist the effects of new technologies, but such a lifestyle would "be lonely and in the end futile"; despite a few abstaining individuals, new technologies would nevertheless shape economics, which in turn would shape society.[25][53] Asked in a 2009 interview about his own choices regarding Internet usage, Carr said, "If you cherish the ability to concentrate deeply and be reflective, you need to set aside time to read and think every day, so that those circuits in your brain don’t get erased," while acknowledging the difficulty of doing so: "There are broad social and economic changes underway that reward Internet use."[3]

A focus on literary reading

Carr's selection of a particular quote from pathologist Bruce Friedman, a member of the faculty of the University of Michigan Medical School who had commented on a growing difficulty he was having with reading books and long essays, and who specifically mentioned the novel War and Peace, was criticized for betraying a bias toward narrative literature. According to some critics, Carr's use of the quote failed to represent other types of literature, such as technical and scientific literature, which had, in contrast, become much more accessible and widely read with the advent of the Internet.[21][55] At the Britannica Blog, writer Clay Shirky pugnaciously observed that War and Peace was "too long, and not so interesting", further stating that "it would be hard to argue that the last ten years have seen a decrease in either the availability or comprehension of material on scientific or technical subjects".[36] Shirky's comments on War and Peace were derided by several of his peers as verging on philistinism.[2][56][57] In Shirky's defense, inventor W. Daniel Hillis asserted that, although books "were created to serve a purpose", that "same purpose can often be served by better means". Even though Hillis considered the book to be "a fine and admirable device", he imagined that clay tablets and scrolls of papyrus, in their time, "had charms of their own".[29] Wired magazine editor Kevin Kelly believed that the idea that "the book is the apex of human culture" should be resisted.[10] Sven Birkerts differentiated online reading from literary reading, stating that in the latter the reader is directed inward and enters "an environment that is nothing at all like the open-ended information zone that is cyberspace", in which Birkerts feels psychologically fragmented.[58][59]

Coping with abundance

Abundance of books makes men less studious.

Hieronimo Squarciafico, a 15th-century Venetian editor, bemoaning the printing press.[60][61]

Several critics theorized about the effects of the shift from scarcity to abundance of written material in the media as a result of the technologies introduced by the Internet. This shift was examined for its potential to lead individuals to a superficial comprehension of many subjects rather than a deep comprehension of just a few. According to Shirky, an individual's ability to concentrate had been facilitated, in the past, by a "relatively empty environment", which had ceased to exist once the wide availability of the web caused new media to proliferate. Although Shirky acknowledged that the unprecedented quantity of written material available on the web might occasion a sacrifice of the cultural importance of many works, he believed that the solution was "to help make the sacrifice worth it".[36] In direct contrast, Sven Birkerts argued that "some deep comprehension of our inheritance [was] essential", and called for "some consensus vision among those shapers of what our society and culture might be shaped toward", warning against allowing the commercial marketplace to dictate the future standing of traditionally important cultural works.[62] While Carr found solace in Shirky's conceit that "new forms of expression" might emerge to suit the Internet, he considered that idea a matter of faith rather than reason.[2] In a later response, Shirky continued to expound upon his theme that "technologies that make writing abundant always require new social structures to accompany them", explaining that Gutenberg's printing press led to an abundance of cheap books which were met by "a host of inventions large and small", such as the separation of fiction from non-fiction, the recognition of talents, the listing of concepts by indexes, and the practice of noting editions.[55]

Impact of the web on memory retention

As a result of the vast stores of information made accessible on the web, a few critics pointed to a decrease in the desire to recall certain types of information, indicating, they believed, a change in the process of recalling information as well as in the types of information that are recalled. According to Ben Worthen, a Wall Street Journal business technology blogger, the growing importance placed on the ability to access information, rather than on the capacity to recall it straight from memory, would in the long term change the types of job skills valued by companies hiring new employees. Given the increased reliance on the Internet, Worthen speculated that before long "the guy who remembers every fact about a topic may not be as valuable as the guy who knows how to find all of these facts and many others".[46][63] Evan Ratliff of Salon.com wondered whether the use of gadgets to recall phone numbers, as well as geographical and historical information, had the effect of releasing certain cognitive resources that, in turn, strengthened other aspects of cognition. Drawing parallels with transactive memory — a process whereby people remember things in relationships and groups — Ratliff mused that perhaps the web was "like a spouse who is around all the time, with a particular knack for factual memory of all varieties".[58] Far from conclusive, these ruminations left the web's impact on memory retention an open question.[58]

Themes and motifs

Effect of technology on the brain's neural circuitry

An 1878 model of the Malling-Hansen Writing Ball, which Nietzsche began using in 1882 when his poor eyesight made it difficult for him to write by hand.[64][65]

In the essay, Carr introduces the scientific support for the idea that the brain's neural circuitry can be rewired by way of an example in which philosopher Friedrich Nietzsche is said to have been influenced by technology. According to German scholar Friedrich A. Kittler in his book Gramophone, Film, Typewriter, Nietzsche's writing style became more aphoristic after he started using a typewriter. Nietzsche had begun using a Malling-Hansen Writing Ball because his failing eyesight had made it increasingly difficult for him to write by hand.[25][66] The idea that Nietzsche's writing style had changed, for better or worse, when he adopted the typewriter was disputed by several critics. Kevin Kelly and Scott Esposito each offered alternative explanations for the apparent changes.[29][31][67] Esposito believed that "the brain is so huge and amazing and enormously complex that it's far, far off base to think that a few years of Internet media or the acquisition of a typewriter can fundamentally rewire it."[31] In a response to Esposito's point, neuroscientist James Olds stated that recent brain research demonstrated that it was "pretty clear that the adult brain can re-wire on the fly". The New York Times reported that several scientists considered it entirely plausible that the brain's neural circuitry is shaped differently by regular Internet usage than by the reading of printed works.[8]

Although there was a consensus in the scientific community that the brain's neural circuitry can change through experience, the potential effect of web technologies on that circuitry was unknown.[38][39] On the topic of the Internet's effect on reading skills, Guinevere F. Eden, director of the Center for the Study of Learning at Georgetown University, remarked that the question was whether the Internet changed the brain in a way that was beneficial to the individual.[8] Carr believed that the effect of the Internet on cognition was detrimental, weakening the ability to concentrate and contemplate. Olds cited the potential benefits of computer software that specifically targets learning disabilities, noting that some neuroscientists believed neuroplasticity-based software to be beneficial in treating receptive language disorders.[38] Olds mentioned neuroscientist Michael Merzenich, who with his peers had formed several companies that developed neuroplasticity-based computer programs intended to improve the cognitive functioning of children, adults and the elderly.[38][68] In 1996, Merzenich and his colleagues had started a company called Scientific Learning, where neuroplasticity research was used to develop a computer training program called Fast ForWord offering seven brain exercises aimed at children's language impairments and learning disabilities.[69] Feedback on Fast ForWord indicated that these brain exercises even had benefits for autistic children, an unexpected spillover effect that Merzenich has attempted to harness by developing a modification of Fast ForWord specifically designed for autism.[70] At a subsequent company started by Merzenich, Posit Science, Fast ForWord-like brain exercises and other techniques were developed with the aim of sharpening the brains of elderly people by maintaining their plasticity.[71]

HAL in 2001: A Space Odyssey

In the penultimate scene of Stanley Kubrick's 1968 science fiction film 2001: A Space Odyssey, astronaut David Bowman slowly disassembles the mind of an artificial intelligence named HAL by sequentially unplugging its memory banks. Carr likened the despair expressed by HAL as its mind is disassembled to the cognitive difficulties he was experiencing, at the time, in engaging with long texts.[4] He felt as if someone was "tinkering with [his] brain, remapping the neural circuitry, reprogramming the memory".[21] HAL had also been used as a metaphor for the "ultimate search engine" in a PBS interview with Google co-founder Sergey Brin, as noted in Carr's book The Big Switch, and in Brin's TED talk. Brin had compared Google's ambition of building an artificial intelligence to HAL, while dismissing the possibility that a bug like the one that led HAL to murder the occupants of the fictional spacecraft Discovery One could occur in a Google-based artificial intelligence.[25][72][73] Carr observed in his essay that throughout history technological advances have often necessitated new metaphors, such as the mechanical clock engendering the simile "like clockwork" and the age of the computer engendering the simile "like computers". Carr concluded his essay with an explanation of why he believed HAL was an appropriate metaphor for his essay's argument. He observed that HAL showed genuine emotion as his mind was disassembled while, throughout the film, the humans aboard the spacecraft appeared to be automatons, thinking and acting as if they were following the steps of an algorithm. Carr believed that the film's prophetic message was that, as individuals increasingly rely on computers for an understanding of their world, their intelligence may become more machinelike than human.[4][21]

Developing view of how Internet use affects cognition

The brain is very specialized in its circuitry and if you repeat mental tasks over and over it will strengthen certain neural circuits and ignore others.

— Gary Small, a professor at UCLA's Semel Institute for Neuroscience and Human Behavior.[74]

After the publication of Carr's essay, a view of the issue continued to develop in the media as sociological and neurological studies relevant to determining the cognitive impact of regular Internet usage surfaced. Challenges to Carr's argument were made frequently. Kevin Kelly appealed to Carr and Birkerts, as the two most outspoken detractors of electronic media, to each formulate a more precise definition of the faults they perceived in electronic media so that their beliefs could be scientifically verified.[75] While Carr firmly believed that his skepticism about the Internet's benefits to cognition was warranted,[2] he cautioned in both his essay and his book The Big Switch that long-term psychological and neurological studies were required to definitively ascertain how cognition develops under the influence of the Internet.[5][21][76]

Scholars at University College London conducted a study titled "Information Behaviour of the Researcher of the Future", the results of which suggested that students' research habits tended towards skimming and scanning rather than in-depth reading.[77] The study provoked serious reflection among educators about the implications for educational instruction.[78]

In October 2008, new insights into the effect of Internet usage on cognition were gleaned from the results, reported in a press release,[79] of a study conducted by UCLA's Memory and Aging Research Center that had tested two groups of people between the ages of 55 and 76, only one of which was experienced in using the web. While the participants were reading books or performing assigned search tasks, their brain activity was monitored with functional MRI scans, which revealed that both reading and Internet searching used the same language, reading, memory, and visual regions of the brain; however, those searching the web also stimulated additional decision-making and complex reasoning regions, with a two-fold increase in activity in these regions among experienced web users compared with inexperienced ones.[80][81][82][83] Gary Small, the director of the UCLA center and lead investigator of the study, released the book iBrain: Surviving the Technological Alteration of the Modern Mind, co-authored with Gigi Vorgan, concurrently with the press release.

While one set of critics and bloggers used the UCLA study to dismiss the argument raised in Carr's essay,[84][85] another set attempted to determine what conclusions could be drawn from the study about the effects of Internet usage.[86] Among the questions raised about the possible interpretations of the UCLA study were whether the greater breadth of brain activity observed in the functional MRI scans of users performing Internet searches, in comparison with reading a book, improved or impaired the reading process; and whether the decision-making and complex reasoning skills that, according to the study, seemed to be involved when a user performed an Internet search were indicative of the user's quality of thought or of the user's ability to solve puzzles.[87][88] Thomas Claburn, in InformationWeek, observed that the study's findings regarding the cognitive impact of regular Internet usage were inconclusive and stated that "it will take time before it's clear whether we should mourn the old ways, celebrate the new, or learn to stop worrying and love the Net".[7]

Varieties of Internet usage

The World Wide Web cannot respond to interrogation unless it contains answers. The answer to a question like "How did John Dalton come up with the atomic theory?" cannot be generated without historical study by experts trained in historical research methods and knowledgeable about the principles of chemistry, who must then deploy their results on the web. Those activities should enhance the researcher's cognitive abilities. The questioner seeking the information, if serious about knowing the answer, will have to engage with lengthy text and argument, which should enhance the questioner's cognitive abilities as well.

References

  1. Nicholas Carr. Pages and "pages", Rough Type, 2008-06-12. Retrieved on 2008-11-01.
  2. 2.0 2.1 2.2 2.3 2.4 Nicholas Carr. Why Skepticism is Good: My Reply to Clay Shirky, Britannica Blog, 2008-07-17.
  3. 3.0 3.1 3.2 Arnie Cooper. Computing The Cost, The Sun, March 2009.
  4. 4.0 4.1 4.2 4.3 4.4 4.5 Bill Thompson. Changing the way we think, BBC News, 2008-06-17.
  5. 5.0 5.1 5.2 Steve Johnson. Read this if you're easily distracted lately, Chicago Tribune, 2008-06-18. Retrieved on 2009-02-10.
  6. David Aaronovitch. The internet shrinks your brain? What rubbish, The Times, 2008-08-13. Retrieved on 2008-12-01.
  7. 7.0 7.1 Thomas Claburn. Is Google Making Us Smarter?: UCLA researchers report that searching the Internet may help improve brain function, InformationWeek, 2008-10-15. Retrieved on 2008-11-01.
  8. 8.0 8.1 8.2 Motoko Rich. Literacy Debate: Online, R U Really Reading?, The New York Times, 2008-07-27. Retrieved on 2008-11-01.
  9. Sven Birkerts. The Gutenberg Elegies: The Fate of Reading in an Electronic Age, 1st ed., Faber & Faber, December 1994. ISBN 9780571198498.
  10. 10.0 10.1 Kevin Kelly. The Fate of the Book (and a Question for Sven Birkerts), Britannica Blog (originally posted at Kelly's blog The Technium), 2008-07-25.
  11. 11.0 11.1 Bernard Sharratt. Are There Books in Our Future, The New York Times, 1994-12-18. Retrieved on 2008-11-01.
  12. 12.0 12.1 John Naughton. I Google, therefore I am losing the ability to think, The Observer, 2008-06-22. Retrieved on 2008-10-20.
  13. The Gutenberg Elegies, pp. 17–20
  14. 14.0 14.1 Birkerts 1994, pp. 146–149
  15. John Walsh and Kate Burt. Can intelligent literature survive in the digital age?, The Independent, 2008-09-14. Retrieved on 2008-10-20.
  16. William Leith. We were never meant to read, The Daily Telegraph, 2008-03-28. Retrieved on 2008-11-01.
  17. Wolf 2007, p. 17
  18. 18.0 18.1 Andrea Hiott. Maryanne Wolf, Deep Reading and a Squid, Pulse Berlin.
  19. 19.0 19.1 Template:Cite podcast
  20. 20.0 20.1 Malcolm Ritter. Scientists ask: Is technology rewiring our brains?, International Herald Tribune, 2008-12-03. Retrieved on 2009-02-10.
  21. 21.0 21.1 21.2 21.3 21.4 21.5 Nicholas Carr. Is Google Making Us Stupid?, The Atlantic, July/August 2008. Retrieved on 2008-10-06.
  22. Maryanne Wolf and Mirit Barzillai (March 2009). "The Importance of Deep Reading". Educational Leadership 66 (6): pp. 32–37.
  23. Maryanne Wolf. Learning to think in a digital world, The Boston Globe, 2007-09-05. Retrieved on 2008-11-01.
  24. Maryanne Wolf. Reading Worrier, Powell's Books. Retrieved on 2007-10-13.
  25. 25.0 25.1 25.2 25.3 25.4 25.5 25.6 Nicholas Carr. 'Is Google Making Us Stupid?': sources and notes, Rough Type, 2008-08-07. Retrieved on 2008-11-01.
  26. Wolf 2007, pp. 60–65
  27. Richard Foreman. The Pancake People, Or, 'The Gods Are Pounding My Head', Edge, 2005-08-03. Retrieved on 2008-11-01.
  28. Damon Darlin. Technology Doesn’t Dumb Us Down. It Frees Our Minds., The New York Times, 2008-09-20.
  29. 29.0 29.1 29.2 29.3 29.4 The Reality Club: On 'Is Google Making Us Stupid' by Nicholas Carr. Edge (2008-07-10). Retrieved on 2008-11-01.
  30. 30.0 30.1 Andrew Keen. Is the Internet killing the American reader?, The Great Seduction, 2008-07-27. Retrieved on 2008-10-15.
  31. 31.0 31.1 31.2 31.3 Scott Esposito. Friday Column: Is Google Making Us Read Worse?, Conversational Reading, 2008-06-20.
  32. Unger 2004, pp. 2–5
  33. Wolf 2007, pp. 35–37
  34. In a comment from Nicholas Carr on Book critic Scott Esposito's column concerning his criticism of Carr's usage of the term 'ideogram', Carr said: "As to 'ideogram,' I agree that there's debate on terminology, but in my article I decided to use the common term. The Oxford American Dictionary defines ideogram in this way: 'a written character symbolizing the idea of a thing without indicating the sounds used to say it, e.g., numerals and Chinese characters.'"
  35. Seth Finkelstein. Nick Carr: 'Is Google Making Us Stupid?', and Man vs. Machine, Infothought, 2008-06-09.
  36. 36.0 36.1 36.2 Clay Shirky. Why Abundance is Good: A Reply to Nick Carr, Britannica Blog, 2008-07-17.
  37. David Wolman. The Critics Need a Reboot. The Internet Hasn't Led Us Into a New Dark Age, Wired, 2008-08-18.
  38. 38.0 38.1 38.2 38.3 Andrew Sullivan. Not So Google Stoopid, Ctd., The Daily Dish, 2008-06-20. Retrieved on 2009-01-15.
  39. 39.0 39.1 Michael Merzenich. Going googly., "On the Brain" blog, 2008-08-11. Retrieved on 2008-11-01.
  40. Gary Small. Letters to the Editor: Our Brains on Google, The Atlantic, October 2008. Retrieved on 2008-11-01.
  41. Eric Schmidt. Video: Google CEO Responds to Atlantic's 'Stoopid' Cover, Ad Age, July 30, 2008.
  42. Nicholas Carr. Google's chief executive worrywart, RoughType, March 8, 2009.
  43. Eric Schmidt. A conversation with Eric Schmidt, CEO of Google, The Charlie Rose Show, March 6, 2009.
  44. Nicholas Carr. Eric Schmidt's second thoughts, RoughType, January 30, 2010.
  45. Google boss worries about teen reading, Agence France-Presse, January 29, 2010.
  46. 46.0 46.1 46.2 46.3 Compiled (with help from Google) by Evan R. Goldstein (2008-07-11). "CRITICAL MASS: Your Brain on Google", The Chronicle of Higher Education. NOTE: Contains excerpts from columnist Margaret Wente, author Jon Udell, blogger Matthew Ingram, book critic Scott Esposito, blogger Seth Finkelstein, technology analyst Bill Thompson, blogger Ben Worthen, and senior editor Andrew Sullivan.
  47. Andrew Sullivan. Google is giving us pond-skater minds, The Times, 2008-06-15. Retrieved on 2008-11-01.
  48. Margaret Wente. How Google ate my brain, The Globe and Mail, 2008-06-17. Retrieved on 2008-07-01.
  49. Leonard Pitts, Jr. Reader finds satisfaction in a good read, Miami Herald, 2008-06-15. Retrieved on 2009-02-10.
  50. 50.0 50.1 Jon Udell. A quiet retreat from the busy information commons, Strategies for Internet citizens, 2008-06-10. Retrieved on 2008-11-01.
  51. John Battelle. Google: Making Nick Carr Stupid, But It's Made This Guy Smarter, Searchblog, 2008-06-10. Retrieved on 2008-11-01.
  52. Scott Rosenberg. Nick Carr’s new knock on the Web: does it change how we read?, Wordyard, 2008-06-11. Retrieved on 2008-11-01.
  53. 53.0 53.1 Carr 2008, pp. 22–23
  54. Carr 2008, pp. 21–22
  55. 55.0 55.1 Clay Shirky. Why Abundance Should Breed Optimism: A Second Reply to Nick Carr, Britannica Blog, 2008-07-21.
  56. Larry Sanger. A Defense of Tolstoy & the Individual Thinker: A Reply to Clay Shirky, Britannica Blog, 2008-07-18.
  57. Larry Sanger. The Internet and the Future of Civilization, Britannica Blog, 2008-07-30.
  58. 58.0 58.1 58.2 Evan Ratliff. Are you losing your memory thanks to the Internet?, Salon.com, 2008-08-14.
  59. Sven Birkerts. Reading in the Open-ended Information Zone Called Cyberspace: My Reply to Kevin Kelly, Britannica Blog, 2008-07-25.
  60. Ong 1982, p. 79
  61. Lowry 1979, pp. 29–31
  62. Sven Birkerts. A Know-Nothing’s Defense of Serious Reading & Culture: A Reply to Clay Shirky, Britannica Blog, 2008-07-18.
  63. Ben Worthen. Does the Internet Make Us Think Different?, The Wall Street Journal, 2008-07-11.
  64. Kwansah-Aidoo 2005, pp. 100–101
  65. Friedrich Nietzsche and his typewriter - a Malling-Hansen Writing Ball. The International Rasmus Malling-Hansen Society. Retrieved on 2008-11-26.
  66. Kittler 1999, pp. 203, 206
  67. Kevin Kelly. Will We Let Google Make Us Smarter?, The Technium, 2008-06-11. Retrieved on 2008-11-01.
  68. Doidge 2007, pp. 45–48
  69. Doidge 2007, pp. 70–72
  70. Doidge 2007, pp. 74–83
  71. Doidge 2007, pp. 84–91
  72. Carr 2008, p. 213
  73. Spencer Michaels, "The Search Engine that Could", The NewsHour with Jim Lehrer, November 29, 2002.
  74. Belinda Goldsmith. Is surfing the Internet altering your brain?, Reuters, 2008-10-27. Retrieved on 2008-11-01.
  75. Kevin Kelly. Time to Prove the Carr Thesis: Where’s the Science?, Britannica Blog, 2008-07-25. Retrieved on 2008-11-01.
  76. Carr 2008, p. 227
  77. CIBER, "Information Behaviour of the Researcher of the Future: A CIBER Briefing Paper" (University College London, 2008), p. 31.
  78. Meris Stansbury. Rethinking research in the Google era: Educators ponder how the internet has changed students' reading habits, eSchool News, 2008-10-15. Retrieved on 2008-11-01.
  79. UCLA Newsroom (2008-10-14). UCLA study finds that searching the Internet increases brain function. Press release. Retrieved on 2008-11-01.
  80. Mary Brophy Marcus. Internet search results: Increased brain activity, USA Today, 2008-10-15. Retrieved on 2008-11-01.
  81. Madison Park. Study: Google does a brain good, CNN, 2008-10-14. Retrieved on 2008-10-20.
  82. Jeneen Interlandi. Reading This Will Change Your Brain, Newsweek, 2008-10-14. Retrieved on 2008-11-01.
  83. Internet use 'good for the brain', BBC News, 2008-10-14. Retrieved on 2008-11-01.
  84. John Battelle. Google Makes You Smarter? Hey, Who Said That?, Searchblog, 2008-10-14. Retrieved on 2008-11-01.
  85. Betsy Schiffman. Study: Google Makes You Smart, Wired, 2008-10-15. Retrieved on 2008-11-01.
  86. Steve Johnson. Searching for meaning in brain scans of seniors, Chicago Tribune, 2008-10-28. Retrieved on 2008-11-01.
  87. Nicholas Carr. Googling and intelligence, Rough Type, 2008-10-17. Retrieved on 2008-11-01.
  88. Is Google Making Us Smarter?, The New York Times, Freakonomics Blog, 2008-10-16. Retrieved on 2008-11-01.