
There’s little reason to believe that this new approach to incubating an intelligent machine will prove any more fruitful than the old one. It, too, is built on reductive assumptions. It takes for granted that the brain operates according to the same formal mathematical rules as a computer does—that, in other words, the brain and the computer speak the same language. But that’s a fallacy born of our desire to explain phenomena we don’t understand in terms we do understand. John von Neumann himself warned against falling victim to this fallacy. “When we talk about mathematics,” he wrote toward the end of his life, “we may be discussing a secondary language, built on the primary language truly used by our central nervous system.” Whatever the nervous system’s language may be, “it cannot fail to differ considerably from what we consciously and explicitly consider as mathematics.”[65]

It’s also a fallacy to think that the physical brain and the thinking mind exist as separate layers in a precisely engineered “architecture.” The brain and the mind, the neuroplasticity pioneers have shown, are exquisitely intertwined, each shaping the other. As Ari Schulman wrote in “Why Minds Are Not like Computers,” a 2009 New Atlantis article, “Every indication is that, rather than a neatly separable hierarchy like a computer, the mind is a tangled hierarchy of organization and causation. Changes in the mind cause changes in the brain, and vice versa.” To create a computer model of the brain that would accurately simulate the mind would require the replication of “every level of the brain that affects and is affected by the mind.”[66] Since we’re nowhere near disentangling the brain’s hierarchy, much less understanding how its levels act and interact, the fabrication of an artificial mind is likely to remain an aspiration for generations to come, if not forever.

Google is neither God nor Satan, and if there are shadows in the Googleplex they’re no more than the delusions of grandeur. What’s disturbing about the company’s founders is not their boyish desire to create an amazingly cool machine that will be able to outthink its creators, but the pinched conception of the human mind that gives rise to such a desire.

Search, Memory

Socrates was right. As people grew accustomed to writing down their thoughts and reading the thoughts others had written down, they became less dependent on the contents of their own memory. What once had to be stored in the head could instead be stored on tablets and scrolls or between the covers of codices. People began, as the great orator had predicted, to call things to mind not “from within themselves, but by means of external marks.” The reliance on personal memory diminished further with the spread of the letterpress and the attendant expansion of publishing and literacy. Books and journals, at hand in libraries or on the shelves in private homes, became supplements to the brain’s biological storehouse. People didn’t have to memorize everything anymore. They could look it up.

But that wasn’t the whole story. The proliferation of printed pages had another effect, which Socrates didn’t foresee but may well have welcomed. Books provided people with a far greater and more diverse supply of facts, opinions, ideas, and stories than had been available before, and both the method and the culture of deep reading encouraged the commitment of printed information to memory. In the seventh century, Isidore, the bishop of Seville, remarked how reading “the sayings” of thinkers in books “render[ed] their escape from memory less easy.”[1] Because every person was free to chart his own course of reading, to define his own syllabus, individual memory became less of a socially determined construct and more the foundation of a distinctive perspective and personality. Inspired by the book, people began to see themselves as the authors of their own memories. Shakespeare has Hamlet call his memory “the book and volume of my brain.”

In worrying that writing would enfeeble memory, Socrates was, as the Italian novelist and scholar Umberto Eco says, expressing “an eternal fear: the fear that a new technological achievement could abolish or destroy something that we consider precious, fruitful, something that represents for us a value in itself, and a deeply spiritual one.” The fear in this case turned out to be misplaced. Books provide a supplement to memory, but they also, as Eco puts it, “challenge and improve memory; they do not narcotize it.”[2]

The Dutch humanist Desiderius Erasmus, in his 1512 textbook De Copia, stressed the connection between memory and reading. He urged students to annotate their books, using “an appropriate little sign” to mark “occurrences of striking words, archaic or novel diction, brilliant flashes of style, adages, examples, and pithy remarks worth memorizing.” He also suggested that every student and teacher keep a notebook, organized by subject, “so that whenever he lights on anything worth noting down, he may write it in the appropriate section.” Transcribing the excerpts in longhand, and rehearsing them regularly, would help ensure that they remained fixed in the mind. The passages were to be viewed as “kinds of flowers,” which, plucked from the pages of books, could be preserved in the pages of memory.[3]

Erasmus, who as a schoolboy had memorized great swathes of classical literature, including the complete works of the poet Horace and the playwright Terence, was not recommending memorization for memorization’s sake or as a rote exercise for retaining facts. To him, memorizing was far more than a means of storage. It was the first step in a process of synthesis, a process that led to a deeper and more personal understanding of one’s reading. He believed, as the classical historian Erika Rummel explains, that a person should “digest or internalize what he learns and reflect rather than slavishly reproduce the desirable qualities of the model author.” Far from being a mechanical, mindless process, Erasmus’s brand of memorization engaged the mind fully. It required, Rummel writes, “creativeness and judgment.”[4]

Erasmus’s advice echoed that of the Roman Seneca, who also used a botanical metaphor to describe the essential role that memory plays in reading and in thinking. “We should imitate bees,” Seneca wrote, “and we should keep in separate compartments whatever we have collected from our diverse reading, for things conserved separately keep better. Then, diligently applying all the resources of our native talent, we should mingle all the various nectars we have tasted, and then turn them into a single sweet substance, in such a way that, even if it is apparent where it originated, it appears quite different from what it was in its original state.”[5] Memory, for Seneca as for Erasmus, was as much a crucible as a container. It was more than the sum of things remembered. It was something newly made, the essence of a unique self.

Erasmus’s recommendation that every reader keep a notebook of memorable quotations was widely and enthusiastically followed. Such notebooks, which came to be called “commonplace books,” or just “commonplaces,” became fixtures of Renaissance schooling. Every student kept one.[6] By the seventeenth century, their use had spread beyond the schoolhouse. Commonplaces were viewed as necessary tools for the cultivation of an educated mind. In 1623, Francis Bacon observed that “there can hardly be anything more useful” as “a sound help for the memory” than “a good and learned Digest of Common Places.” By aiding the recording of written works in memory, he wrote, a well-maintained commonplace “supplies matter to invention.”[7] Through the eighteenth century, according to American University linguistics professor Naomi Baron, “a gentleman’s commonplace book” served “both as a vehicle for and a chronicle of his intellectual development.”[8]

The popularity of commonplace books ebbed as the pace of life quickened in the nineteenth century, and by the middle of the twentieth century memorization itself had begun to fall from favor. Progressive educators banished the practice from classrooms, dismissing it as a vestige of a less enlightened time. What had long been viewed as a stimulus for personal insight and creativity came to be seen as a barrier to imagination and then simply as a waste of mental energy. The introduction of new storage and recording media throughout the last century—audiotapes, videotapes, microfilm and microfiche, photocopiers, calculators, computer drives—greatly expanded the scope and availability of “artificial memory.” Committing information to one’s own mind seemed ever less essential. The arrival of the limitless and easily searchable data banks of the Internet brought a further shift, not just in the way we view memorization but in the way we view memory itself. The Net quickly came to be seen as a replacement for, rather than just a supplement to, personal memory. Today, people routinely talk about artificial memory as though it’s indistinguishable from biological memory.

Clive Thompson, the Wired writer, refers to the Net as an “outboard brain” that is taking over the role previously played by inner memory. “I’ve almost given up making an effort to remember anything,” he says, “because I can instantly retrieve the information online.” He suggests that “by offloading data onto silicon, we free our own gray matter for more germanely ‘human’ tasks like brainstorming and daydreaming.”[9] David Brooks, the popular New York Times columnist, makes a similar point. “I had thought that the magic of the information age was that it allowed us to know more,” he writes, “but then I realized the magic of the information age is that it allows us to know less. It provides us with external cognitive servants—silicon memory systems, collaborative online filters, consumer preference algorithms and networked knowledge. We can burden these servants and liberate ourselves.”[10]

Peter Suderman, who writes for the American Scene, argues that, with our more or less permanent connections to the Internet, “it’s no longer terribly efficient to use our brains to store information.” Memory, he says, should now function like a simple index, pointing us to places on the Web where we can locate the information we need at the moment we need it: “Why memorize the content of a single book when you could be using your brain to hold a quick guide to an entire library? Rather than memorize information, we now store it digitally and just remember what we stored.” As the Web “teaches us to think like it does,” he says, we’ll end up keeping “rather little deep knowledge” in our own heads.[11] Don Tapscott, the technology writer, puts it more bluntly. Now that we can look up anything “with a click on Google,” he says, “memorizing long passages or historical facts” is obsolete. Memorization is “a waste of time.”[12]

Our embrace of the idea that computer databases provide an effective and even superior substitute for personal memory is not particularly surprising. It culminates a century-long shift in the popular view of the mind. As the machines we use to store data have become more voluminous, flexible, and responsive, we’ve grown accustomed to the blurring of artificial and biological memory. But it’s an extraordinary development nonetheless. The notion that memory can be “outsourced,” as Brooks puts it, would have been unthinkable at any earlier moment in our history. For the ancient Greeks, memory was a goddess: Mnemosyne, mother of the Muses. To Augustine, it was “a vast and infinite profundity,” a reflection of the power of God in man.[13] The classical view remained the common view through the Middle Ages, the Renaissance, and the Enlightenment—up to, in fact, the close of the nineteenth century. When, in an 1892 lecture before a group of teachers, William James declared that “the art of remembering is the art of thinking,” he was stating the obvious.[14] Now, his words seem old-fashioned. Not only has memory lost its divinity; it’s well on its way to losing its humanness. Mnemosyne has become a machine.

The shift in our view of memory is yet another manifestation of our acceptance of the metaphor that portrays the brain as a computer. If biological memory functions like a hard drive, storing bits of data in fixed locations and serving them up as inputs to the brain’s calculations, then offloading that storage capacity to the Web is not just possible but, as Thompson and Brooks argue, liberating. It provides us with a much more capacious memory while clearing out space in our brains for more valuable and even “more human” computations. The analogy has a simplicity that makes it compelling, and it certainly seems more “scientific” than the suggestion that our memory is like a book of pressed flowers or the honey in a beehive’s comb. But there’s a problem with our new, post-Internet conception of human memory. It’s wrong.

 

AFTER DEMONSTRATING, IN the early 1970s, that “synapses change with experience,” Eric Kandel continued to probe the nervous system of the lowly sea slug for many years. The focus of his work shifted, though. He began to look beyond the neuronal triggers of simple reflex responses, such as the slug’s withdrawal of its gill when touched, to the much more complicated question of how the brain stores information as memories. Kandel wanted, in particular, to shed light on one of the central and most perplexing riddles in neuroscience: how, exactly, does the brain transform fleeting short-term memories, such as the ones that enter and exit our working memory every waking moment, into the long-term memories that can last a lifetime?

Neurologists and psychologists had known since the end of the nineteenth century that our brains hold more than one kind of memory. In 1885, the German psychologist Hermann Ebbinghaus conducted an exhausting series of experiments, using himself as the sole subject, that involved memorizing two thousand nonsense words. He discovered that his ability to retain a word in memory strengthened the more times he studied the word and that it was much easier to memorize a half dozen words at a sitting than to memorize a dozen. He also found that the process of forgetting had two stages. Most of the words he studied disappeared from his memory very quickly, within an hour after he rehearsed them, but a smaller set stayed put much longer—they slipped away only gradually. The results of Ebbinghaus’s tests led William James to conclude, in 1890, that memories were of two kinds: “primary memories,” which evaporated from the mind soon after the event that inspired them, and “secondary memories,” which the brain could hold onto indefinitely.[15]

At around the same time, studies of boxers revealed that a concussive blow to the head could bring on retrograde amnesia, erasing all memories stored during the preceding few minutes or hours while leaving older memories intact. The same phenomenon was noted in epileptics after they suffered seizures. Such observations implied that a memory, even a strong one, remains unstable for a brief period after it’s formed. A certain amount of time seemed to be required for a primary, or short-term, memory to be transformed into a secondary, or long-term, one.

That hypothesis was backed up by research conducted by two other German psychologists, Georg Müller and Alfons Pilzecker, in the late 1890s. In a variation on Ebbinghaus’s experiments, they asked a group of people to memorize a list of nonsense words. A day later, they tested the group and found that the subjects had no problem recalling the list. The researchers then conducted the same experiment on another group of people, but this time they had the subjects study a second list of words immediately after learning the first list. In the next day’s test, this group was unable to remember the initial set of words. Müller and Pilzecker then conducted one last trial, with another twist. The third group of subjects memorized the first list of words and then, after a delay of two hours, were given the second list to study. This group, like the first, had little trouble remembering the initial list of words the next day. Müller and Pilzecker concluded that it takes an hour or so for memories to become fixed, or “consolidated,” in the brain. Short-term memories don’t become long-term memories immediately, and the process of their consolidation is delicate. Any disruption, whether a jab to the head or a simple distraction, can sweep the nascent memories from the mind.[16]
