I Can Hear You Whisper
Author: Lydia Denworth
[Figure key: SO: superior olive; IC: inferior colliculus; MGB: medial geniculate body]
The cochlear nuclei sort the incoming auditory signal along two different tracks, and the various types of cells of the cochlear nuclei each specialize in just one kind of signal. The organization by tone of the basilar membrane, for instance, is repeated in the brain stem, so that some cells respond to high-frequency sounds and others to low-frequency sounds. It is thought that features such as your ability to tell where a sound came from or the fact that you jump at loud noises can be traced to specific cells in the dorsal and ventral pathways respectively. From the cochlear nuclei, sound signals follow these two parallel pathways, along the back and the belly of the brain, in an intricate and complex route passing through regions such as the superior olive (really) and the medial geniculate until they reach the auditory cortex in the temporal lobe, just above the ear where the sound started. Here, too, there are specialized cells to extract features of the sound and make sense of it. With practice, the auditory cortex gets better and better at sophisticated listening, which is why a trained technician, for example, can tune a piano by ear.
If we wanted Alex to be able to listen and talk, he needed first to hear sounds (to recognize simply that someone is speaking) and then to make sense of those sounds (to someday follow a teacher's explanation of how to do long division). The former would require a jet assist from technology. The latter would require practice and time.
• • •
Information about plasticity first really reached parents in the mid-1990s, just when I was thinking about having my first child. It was easy to freak out about the responsibilities now incumbent on parents to oversee the creation of perfect brains. The possibilities for learning seemed to be pushed to younger and younger ages. Now it wasn't enough to nurture and care for your child; a parent had to be always considering how to maximize learning potential. Toy manufacturers take advantage of this worry by selling us as many “enriching” and “brain-boosting” products as possible.
Frankly, most middle-class, educated, professional parents like me have the luxury of thinking in terms of enrichment. Our children have the good luck to be born into families where basic needs are easily met; in their early years, these kids are carefully nurtured, fed nutritious meals, talked to, read to, played with. All of that is essential to building strong brain architecture in a way that Baby Mozart tapes and foreign language playgroups are not.
In brain plasticity terms, the flip side of enrichment is deprivation. That's what neurologists call it when a brain lacks stimulation. “If a system can be influenced by the environment and enhanced, that same system is vulnerable because if the right input isn't available then it won't develop optimally,” says the University of Oregon's Helen Neville, one of the pioneers in the study of plasticity. You can see this plainly in studies that compare certain brain functions in children from different socioeconomic groups.
A study published in 2009 asked both low- and middle-income nine- and ten-year-olds to watch images flashing on a computer. The children were instructed to press a button when a tilted triangle appeared. This is a skill that reflects activity in the prefrontal cortex, which controls planning and executive function. The poorer children were far less able to detect the tilted triangles and block out distractions, so much so that one of the researchers likened the results to what he sees in stroke victims who have lesions in the prefrontal cortex.
I had generally considered all the hype about enrichment just that: hype. Even so, I was susceptible to worry and guilt. I might have forgotten what a cause-and-effect toy was, but like so many of my friends, I had signed my boys up for baby music classes, read to them daily, owned educational puzzles, and so on.
Suddenly, in Alex's case, I was staring deprivation straight in the face. Alex wasn't talking because not enough sound was traveling the pathways that led to the parts of his brain that deal with hearing.
What exactly did deprivation do to the brain? For a long time, no one really knew, and this was mostly a philosophical question, famously encapsulated by Irish scientist and politician William Molyneux, who wrote to John Locke in 1688: What if a blind person who has learned to distinguish a cube from a globe by touch was suddenly able to see? Would he then be able to recognize these objects by sight? Molyneux and Locke both suspected the answer was no. The question was passed down and contemplated anew through the years like a philosophical folktale. Some philosophers argued yes; others echoed Molyneux and Locke and said no.
As for neuroscientists, once such a job title existed, they thought about such problems in anatomical and physiological terms. What parts of the brain were involved in perception and learning, and how did they work? Well into the twentieth century, it was believed that the brain was unchangeable beyond early childhood. (The few researchers who produced evidence suggesting otherwise were mostly ignored.) Furthermore, it was thought that structure determined function, that each region of the brain could perform only its assigned task. In 1913, Santiago Ramón y Cajal, whom many consider the father of modern neuroscience, wrote that the adult brain's pathways are “fixed, ended, immutable.” Even though it was understood that the brains of children were works in progress, the details of how that developmental work got done were mysterious.
A series of experiments with cats in the 1950s and 1960s forced a radical reassessment of accepted ideas about how experience might affect the brain. David Hubel was born and raised in Canada and Torsten Wiesel in Sweden. They met at Johns Hopkins University, where both were conducting research in physiology, and teamed up, ostensibly for a few months; their partnership lasted twenty-five years, most of them spent at Harvard.
Hubel and Wiesel concentrated on vision, beginning with the question: What is it to see? Compared to the machines that let researchers see inside the brain today, they used spectacularly low-tech equipment. Hubel fashioned a lathe himself to make the electrodes they would use to probe the brain cells of cats and monkeys. When they needed a screen on the ceiling on which to project images, they hung white sheets. And they plotted the data they received from their recordings of the electrical activity of individual cells using pencils and paper tacked to the wall.
Their first few years of work, much of it described in a 1962 paper Hubel called their “magnum opus,” laid the groundwork for what we know today about how visual information comes into the brain and what happens to it thereâcell by cell. But they didn't stop there. Their curiosity had been piqued by the fact that, in children, cataracts led to blindness, whereas in adults, they did not. They were also struck by experiments in which animals raised in darkness were left with impaired vision even though there was nothing wrong with their eyes. So after establishing the physiology of normal vision in adult animals, Hubel and Wiesel asked what happens as vision develops in younger animals and, critically, what happens if there's a problem.
They thought the first logical step was to suture shut one eye of young kittens, so that they could compare the information from each eye. In these kittens, they expected vision to develop relatively normally in the non-deprived eye. When the sutures had been removed and they covered up the good eye, the cats were effectively blind in the eye that had been deprived of vision, even though there was technically nothing wrong with the eye. The damage was obvious. The first cat they experimented with fell off the table when released. “That is something,” wrote Hubel, “that no self-respecting cat would do.” When they sacrificed the animals and looked at their brains, though, they were surprised to find that the cells in the visual cortex looked radically different from those in cats with normal vision. The good eye had co-opted much of the neurological territory of the deprived eye like a vine that spreads into the neighbor's yard. Was it from a failure to develop or a withering from disuse of connections that were already formed? They weren't sure, but proposed the latter. Hubel and Wiesel also discovered that some of the visual connections wired up in spite of disuse, which meant that some of the neurological functioning had to be innate.
The next round of experiments involved suturing both eyes of kittens. (Science can undoubtedly be cruel.) After about three months, they reopened the cats' eyes. Again, the cats were effectively blind, even though there was nothing wrong with their eyes. The problem lay in their brains, which had no experience processing visual information and were no longer capable of the task. Finally, Hubel and Wiesel wanted to know how variations in the length of deprivation or the moment of its onset made a difference. Cats hardly use their eyes in the first three weeks after birth, but their visual systems are fairly mature by three months of age. It turned out that if kittens were deprived of sight at any point during the fourth through the eighth week of development, the damage was considerable. After three months of age, however, a cat could have its eyes sutured for months with no appreciable consequences.
This seminal work in vision established the idea of “critical periods” in brain development, called sensitive periods today. Synaptic connections in the brain, it seemed, are best created within a certain window of time, but if an area of the brain is deprived of stimulation beyond that sensitive period, it will reorganize and get used for something else.
• • •
Naturally, I wondered what happens in deafness. Until the advent of cochlear implants, it wasn't possible to take away and then restore hearing as Hubel and Wiesel did with vision in cats. Even with earplugs, there is some hearing, as anyone who has tried to nap near a construction site knows. Thanks in large part to Hubel and Wiesel, the visual cortex, located at the back of the brain, remained the best understood part of the brain for years (it still is, really). The secrets of the auditory system, which concentrates in the temporal area, have taken more time to unlock.
Helen Neville was one of the first neuroscientists to demonstrate how the brain changes in deafness. Neville grew up in Canada and didn't start out wanting to be a scientist. A product of the 1960s, she wanted to be a revolutionary and bring about social justice. “I was a rabble-rouser. I didn't want to go to university,” she told me when we met in her office at the University of Oregon. She was in the middle of finishing several important grant applications, so we were pressed for time, and she spoke in rapid-fire sentences; I suspected she might talk the same way on a lazy Sunday afternoon or over a glass of wine. Her passionate, salty personality (four-letter words and phrases like “crazy-ass religion” were sprinkled through her conversation) was evident. “I just wanted to change the world,” she said. She wasn't enough of a rebel to buck her family's insistence that she get an education, however, so she stayed in school. Naturally curious, she took a wide range of courses, including one in something called neurophysiology, an exploration of the workings of the brain. “I found out that ideas were chemical cascades in the brain,” she says. “That's when I had this epiphany. What?! That's what ideas are? I need to know about that.”
After college, she was still a rabble-rouser. “I went to Montreal to try to start the revolution. But, you know, I couldn't start the revolution. Or get a job.” When she found out she could get paid to go to graduate school, she did that instead. “I realized that if I really did think that I could change ideas and ultimately the world, then I had to know how changeable the brain was. Ideas are your brain, your mind is your brain. I needed to know as much as I could about the changeability of your mind and your brain.” She would bring the revolution to the laboratory.
At the time, “everybody thought the brain was determined and fixed and organized at or before birth,” says Neville. Even Hubel and Wiesel, who had shown such dramatic alterations in the brains of kittens, believed that they had found the limits of change. “Of course it wouldn't change with experience” is how Neville recounts the thinking. “How could you know the brain if it was changing every time you turned around?” She put her mind to the problem of how to show that the reigning dogma was wrong. Her solution: to look at the brains of deaf people. Says Neville: “They'd had such extremely different experiences that if their brains weren't different, maybe what all those guys were saying was right.”
Although it was popularly imagined that deaf people's sight improved and blind people's hearing was sharper, scientists had been disappointed in their search for evidence of any actual compensatory changes. Studies of the blind had found that they could not necessarily hear softer sounds than sighted people could. And the deaf had not been shown to be any better at perceiving contrast, seeing in dimmer light, or perceiving subtle motion in slow-moving objects. Neville wondered if her predecessors had been looking in the wrong places, measuring the wrong skills. It was 1983 by the time she started on this work. The technological innovations such as fMRI that would transform neuroscience, and Neville's own work, were still another decade away. So she used the most advanced technique available: electroencephalography, EEG, a measure of evoked potentials on the scalp. She calls it “eavesdropping on the brain's electrical conversation.” A potential (the metaphorical significance of the term is appealing) is a measurement of the electrical activity in the brain, or roughly how many neurons fire and when. An EEG generates a line that moves up and down representing brain wave activity. Electrodes are glued all over a subject's head and scalp (today there are complicated net caps with all the electrodes sewn in to make the job easier), and the subject is asked to look or listen. The waveforms that are generated indicate how quickly the brain responded to whatever the subject heard or saw.
Using a group of subjects who had been deaf from birth, and a group of hearing controls, Neville set up an experiment in which she told the participants to look straight ahead, and then she repeatedly flashed a light off to the side, where it could be seen only in the peripheral vision. Electrical responses to the flashes of light were two to three times higher in deaf brains than in hearing brains. Even more intriguing was the location of the response. It was not primarily over the visual cortex, “which is where any well-behaved brain should be registering flashes of light,” wrote Sharon Begley when she described the experiment in her book, Train Your Mind, Change Your Brain. Instead, the responses were over the auditory cortex, which conventional wisdom said should be sitting dormant in a deaf person who heard no sound.
Captivated, Neville launched a series of studies trying to pin down what brain functions might have been enhanced in both deaf and blind subjects in examples of what researchers call “compensatory plasticity.” Those studies, which some of her postdoctoral fellows have gone on to pursue in their own laboratories, showed that for the deaf, enhancement came in two areas: peripheral vision and motion processing. In one study, for example, Neville had subjects watch a white square on a computer screen and try to detect which way it was moving. If the square was in the central field of vision, there was no difference between the deaf and the hearing. But in the periphery, the deaf were faster and more accurate than the hearing subjects. “All of the early studies had just studied the center,” says Neville.
In addition, those previous studies included people who had lost their hearing to meningitis or encephalitis. “They were dealing already with an altered brain,” she says dismissively. To avoid confounding variables like that, Neville tested groups of subjects, all of whom had deaf parents. Some subjects were deaf from birth. Others were their hearing siblings, who learned ASL as their first language but had no auditory deprivation. “You want to know what's due to auditory deprivation and what's due to visual and spatial language,” she explains. The brains of those hearing native signers did not show the same changes as their deaf brothers' and sisters'. “That's how we sorted it out,” says Neville, beaming. “It's beautiful, so beautiful!”
She also wanted to explore in more detail the question of what parts of the brain deaf and blind people were using, a question that got easier to answer with the advent of functional MRI in the early 1990s. While EEGs are useful for indicating the timing of a response, fMRI does a better job of pinpointing its location. MRI, without the “functional” prefix, uses a magnetic field to take pictures inside the body and is used diagnostically by clinicians for all manner of injuries and illnesses. fMRI is different. It produces what is essentially an afterimage of neural activity by showing contrasting areas of blood flow in the brain. By watching how blood moves around the brain (or, more precisely, how the level of oxygen in the blood rises or falls), researchers can tell which areas of the brain are active during particular activities. When Neville looked at the brains of deaf and blind subjects with fMRI, it suddenly looked very much as if structure did not necessarily determine function. She found that signing deaf adults were using parts of the left temporal lobe, usually given over to spoken language in hearing people, for visual processing. They were still using that area of the brain for language, but it was a manual, visual language. In blind subjects, the reverse was true: They had enhanced auditory attention processing in certain areas of the visual cortex.
For a signing deaf person, capitalizing on unused portions of the brain makes sense. It seems likely that we start out with redundant connections between auditory and visual areas, and that they can be tweaked by experience, but that the overlap gradually decreases. If you want to use a cochlear implant, however, you need to maximize the auditory cortex's original purpose: hearing. What were the critical periods for the auditory cortex? After what point was it forever reorganized? Or was it? These were the questions Helen Neville proposed and other researchers set out to answer in the late 1990s, once they had enough subjects with cochlear implants to study the issue.