The inventor and futurist Ray Kurzweil calls this the Singularity—“a future period during which the pace of technological change will be so rapid, its impact so deep … that technology appears to be expanding at infinite speed.” One of the foundations of his argument is Moore’s Law, the famous observation made by the engineer (and future chairman of Intel) Gordon Moore in 1965 that with every passing year the miniaturization of computer chips roughly doubled their speed and halved their cost. Forty years ago gigantic mainframes typically performed a few hundred thousand calculations per second and cost several million dollars, but the little thousand-dollar laptop I am now tapping away on can handle a couple of billion per second—a ten-million-fold improvement in price-performance, or a doubling every eighteen months, much as Moore predicted.
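
The arithmetic behind that claim is easy to check. Here is a minimal sketch in Python, assuming the ten-million-fold gain accrued evenly over forty years:

    import math

    # Inputs taken from the passage above:
    improvement = 10_000_000   # ten-million-fold gain in price-performance
    years = 40

    doublings = math.log2(improvement)             # about 23.3 doublings
    months_per_doubling = years * 12 / doublings   # about 20.6 months

    print(f"{doublings:.1f} doublings, one roughly every "
          f"{months_per_doubling:.1f} months")

One doubling roughly every twenty months is close enough to Moore's eighteen; the exact constant matters less than the relentless exponential.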

If this trend continues, says Kurzweil, by about 2030 computers will be powerful enough to run programs reproducing the 10,000 trillion electrical signals that flash every second among the 22 billion neurons inside a human skull. They will also have the memory to store the 10 trillion recollections that a typical brain houses. By that date scanning technology will be accurate enough to map the human brain neuron by neuron—meaning, say the technology boosters, that we will be able to upload actual human minds onto machines. By about 2045, Kurzweil thinks, computers will be able to host all the minds in the world, effectively merging carbon-and silicon-based intelligence into a single global consciousness. This will be the Singularity. We will transcend biology, evolving into a new, merged being as far ahead of Homo sapiens as a contemporary human is of the individual cells that merge to create his/her body.
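
To see why Kurzweil picks dates in the 2030s and 2040s rather than next year, it helps to set his figures beside the laptop numbers above. A rough extrapolation, assuming the two-billion-calculations-per-second laptop and an eighteen-month doubling time (a loose comparison: a signal is not a calculation, and a laptop is not a supercomputer):

    import math

    # Kurzweil's estimates, as quoted above:
    brain_signals_per_sec = 1e16   # 10,000 trillion signals per second
    laptop_calcs_per_sec = 2e9     # the laptop figure quoted earlier

    gap = brain_signals_per_sec / laptop_calcs_per_sec   # ~5,000,000-fold
    doublings = math.log2(gap)                           # ~22.3 doublings
    years = doublings * 1.5                              # 18 months apiece

    print(f"{gap:.0e}-fold gap, {doublings:.1f} doublings, ~{years:.0f} years")

Counted from around 2010, thirty-odd years of doublings lands in the early 2040s, which is roughly where Kurzweil puts the Singularity.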

Kurzweil’s enthusiastic vision provokes as much mockery as admiration (“the Rapture for Nerds,” some call it), and the odds are that—like all prophets before him—he will be wrong much more often than he is right. But one of the things Kurzweil is surely correct about is that what he calls “criticism from incredulity,” simple disbelief that anything so peculiar could happen, is no counterargument. As the Nobel Prize–winning chemist Richard Smalley likes to say, “When a scientist says something is possible, they’re probably underestimating how long it will take. But if they say it’s impossible, they’re probably wrong.” Humans are already taking baby steps toward some sort of Singularity, and governments and militaries are taking the prospect of a Singularity seriously enough to start planning for it.

We can, perhaps, already see what some of these baby steps have wrought. I pointed out in Chapter 10 that the industrial revolution set off even bigger changes in what it means to be human than the agricultural revolution had done. Across much of the world, better diets now allow humans to live twice as long as and grow six inches taller than their great-great-grandparents. Few women now spend more than a small part of their lives bearing and rearing babies, and compared with any earlier age, few babies now die in infancy. In the richest countries doctors seem able to perform miracles—they can keep us looking young (in 2008, five million Botox procedures were performed in the United States), control our moods (one in ten Americans has used Prozac), and consolidate everything from cartilage to erections (in 2005 American doctors wrote 17 million prescriptions for Viagra, Cialis, and Levitra). The aging emperors of antiquity, I suspect, would have thought these little purple pills quite as wonderful as anything in Kurzweil’s Singularity.

Twenty-first-century genetic research promises to transform humanity even more, correcting copying errors in our cells and growing new organs when the ones we were born with let us down. Some scientists think we are approaching “partial immortalization”: like Abraham Lincoln’s famous ax (which had its handle replaced three times and its blade twice), each part of us might be renewed while we ourselves carry on indefinitely.

And why stop at just fixing what is broken? You may remember the 1970s television series The Six Million Dollar Man, which began with a pilot named Steve Austin (played by Lee Majors) losing an arm, an eye, and both legs in a plane crash. “We can rebuild him—we have the technology,” says the voiceover, and Austin quickly reappears as a bionic man who outruns cars, has a Geiger counter in his arm and a zoom lens in his eye, and eventually gets a bionic girlfriend (Lindsay Wagner) too.

Thirty years on, athletes have already gone bionic. When the golfer Tiger Woods needed eye surgery in 2005, he upgraded himself to better-than-perfect 20/15 vision, and in 2008 the International Association of Athletics Federations even temporarily banned the sprinter Oscar Pistorius from the Olympics because his artificial legs seemed to give him an edge over runners hobbled by having real legs.*

By the 2020s middle-aged folks in the developed cores might see farther, run faster, and look better than they did as youngsters. But they will still not be as eagle-eyed, swift, and beautiful as the next generation. Genetic testing already gives parents opportunities to abort fetuses predisposed to undesirable shortcomings, and as we get better at switching specific genes on and off, so-called designer babies engineered for traits that parents like may become an option. Why take chances on nature’s genetic lottery, ask some, if a little tinkering can give you the baby you want?

Because, answer others, eugenics—whether driven by racist maniacs like Hitler or by consumer choice—is immoral. It may also be dangerous: biologists like to say that “evolution is smarter than you,” and we may one day pay a price for trying to outwit nature by culling our herd of traits such as stupidity, ugliness, obesity, and laziness. All this talk of transcending biology, critics charge, is merely playing at being God—to which Craig Venter, one of the first scientists to sequence the human genome, reportedly replies: “We’re not playing.”

Controversy continues, but I suspect that our age, like so many before it, will in the end get the thought it needs. Ten thousand years ago some people may have worried that domesticated wheat and sheep were unnatural; two hundred years ago some certainly felt that way about steam engines. Those who mastered their qualms flourished; those who did not, did not. Trying to outlaw therapeutic cloning, beauty for all, and longer life spans does not sound very workable, and banning the military uses of tinkering with nature sounds even less so.

The United States Defense Advanced Research Projects Agency (DARPA) is one of the biggest funders of research into modifying humans. It was DARPA that brought us the Internet (then called the Arpanet) in the 1970s, and its Brain Interface Project is now looking at molecular-scale computers, built from enzymes and DNA molecules rather than silicon, that could be implanted in soldiers’ heads. The first molecular computers were unveiled in 2002, and by 2004 better versions were helping to fight cancer. DARPA, however, hopes that more advanced models will give soldiers some of the advantages of machines by speeding up their synaptic links, adding memory, and even providing wireless Internet access. In a similar vein, DARPA’s Silent Talk project is working on implants that will decode preverbal electrical signals within the brain and send them over the Internet so troops can communicate without radios or e-mail. One National Science Foundation report suggests that such “network-enabled telepathy” will become a reality in the 2020s.

The final component of Kurzweil’s Singularity, computers that can reproduce the workings of biological brains, is moving even faster. In April 2007 IBM researchers turned a Blue Gene/L supercomputer into a massively parallel cortical simulator that could run a program imitating a mouse’s brain functions. The program was only half as complex as a real mouse brain, and ran at only one-tenth of rodent speed, but by November of that year the same lab had already upgraded to mimicking bigger, more complex rat brains.

Half a slowed-down rat is a long way from a whole full-speed human, and the lab team in fact estimated that a human simulation would require a computer four hundred times as powerful, which with 2007 technology would have had unmanageable energy, cooling, and space requirements. Already in 2008, however, the costs were falling sharply, and IBM predicted that the Blue Gene/Q supercomputer, which should be up and running in 2011, would get at least a quarter of the way there. The even more ambitious Project Kittyhawk, linking thousands of Blue Genes, should move closer still in the 2020s.
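
The scaling ladder in that estimate can be made explicit. A back-of-the-envelope sketch, using only the figures given above and again assuming a doubling every eighteen months:

    import math

    # Figures from the passage:
    mouse_fraction_2007 = 0.5 * 0.1   # half a mouse brain at one-tenth speed
    human_vs_2007 = 400               # human simulation: ~400x the 2007 machine
    bluegene_q_fraction = 0.25        # Blue Gene/Q: "a quarter of the way there"

    doublings_to_human = math.log2(human_vs_2007)            # ~8.6 doublings
    years_to_human = doublings_to_human * 1.5                # ~13 years from 2007
    doublings_after_q = math.log2(1 / bluegene_q_fraction)   # 2 doublings left

    print(f"2007 run: 1/{1 / mouse_fraction_2007:.0f} of a real-time mouse")
    print(f"Human scale: ~{years_to_human:.0f} years after 2007, "
          f"with {doublings_after_q:.0f} doublings left after Blue Gene/Q")

On those assumptions, raw computing power stops being the bottleneck around 2020; whether the result would resemble a mind is another question entirely.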

To insist that this will add up to Kurzweil’s Singularity by 2045 would be rash. It might be rasher still, however, to deny that we are approaching a massive discontinuity. Everywhere we look, scientists are assaulting the boundaries of biology. Craig Venter’s much-publicized ambition to synthesize life had earned him the nickname “Dr. Frankencell,” but in 2010 his team succeeded in manufacturing the genome of a simple bacterium entirely from chemicals and transplanting it into the walls of cells to create JCVI-syn1.0, the earth’s first synthetic self-reproducing organism. Genetics even has its own version of Moore’s Law, Carlson’s Curve:* between 1995 and 2009 the cost of DNA synthesis fell from a dollar per base pair to less than 0.1 cent. By 2020, some geneticists think, building entirely new organisms will be commonplace. Hard as it is to get our minds around the idea, the trends of the last couple of centuries are leading toward a change in what it means to be human, making possible the vast cities, astonishing energy levels, apocalyptic weapons, and science-fiction kinds of information technology implied by social development scores of five thousand points.
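
Carlson's Curve reduces to the same sort of arithmetic as Moore's. A minimal sketch, assuming the two endpoints quoted in the text:

    import math

    # Endpoints from the passage:
    cost_1995 = 1.0      # one dollar per base pair
    cost_2009 = 0.001    # "less than 0.1 cent," i.e., under a tenth of a cent
    years = 2009 - 1995

    halvings = math.log2(cost_1995 / cost_2009)   # ~10 halvings
    months_per_halving = years * 12 / halvings    # ~17 months

    print(f"~{halvings:.0f} halvings, one roughly every "
          f"{months_per_halving:.0f} months")

A halving of cost roughly every seventeen months, which is to say that writing DNA has been getting cheaper at least as fast as computing has been getting faster.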

This book has been full of upheavals in which social development jumped upward, rendering irrelevant many of the problems that had dominated the lives of earlier generations. The evolution of Homo sapiens swept away all previous ape-men; the invention of agriculture made many of the burning issues of hunter-gatherer life unimportant; and the rise of cities and states did the same to the concerns of prehistoric villagers. The closing of the steppe highway and the opening of the oceans ended realities that had constrained Old World development for two thousand years, and the industrial revolution of course made mockery of all that had gone before.

These revolutions have been accelerating, building on one another to drive social development up further and faster each time. If development does leap up by four thousand points in the twenty-first century, as Figure 12.1 predicts, this ongoing revolution will be the biggest and fastest of all. Its core, many futurists agree, lies in linked transformations of genetics, robotics, nanotechnology, and computing, and its consequences will overturn much of what we have known.
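
How big a leap four thousand points is comes through most clearly as a compound growth rate. A rough sketch, assuming (the text does not state it here) a round baseline of about a thousand points at the start of the century:

    # Implied average growth rate for a rise from ~1,000 to ~5,000 points
    # over the twenty-first century (the 1,000 baseline is an assumption,
    # chosen to be consistent with the five-thousand-point scores above).
    start_score = 1_000
    end_score = 5_000
    years = 100

    annual_growth = (end_score / start_score) ** (1 / years) - 1
    print(f"~{annual_growth:.1%} per year, compounded for a century")

About 1.6 percent a year sounds modest, but on these numbers the rise would exceed everything accumulated in all earlier history combined.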

But while Figure 12.1 clearly shows the Eastern development score gaining on the West’s, you may have noticed that every example I cited in this section—DARPA, IBM, the Six Million Dollar Man—was American. Eastern scientists have made plenty of contributions to the new technologies (robotics, for instance, is as advanced in Japan and South Korea as anywhere), but so far the revolution has been overwhelmingly Western. This might mean that the pundits who point to America’s decline and a coming Chinese age will be proved wrong after all: if the United States dominates the new technologies as thoroughly as Britain dominated industrial ones two centuries ago, the genetic/nanotechnology/robotics revolution might shift wealth and power westward even more dramatically than the industrial revolution did.

On the other hand, the underlying shift of wealth from West to East might mean that the current American dominance is just a lag from the twentieth century, and that by the 2020s the big advances will be happening in Eastern labs. China is already using lavish funding to lure its best scientists back from America; perhaps Lenovo, not IBM, will provide the mainframes that host a global consciousness in the 2040s, and Figure 12.1 will be more or less right after all.

Or then again, perhaps the Singularity will render ten-thousand-year-old categories such as “East” and “West” completely irrelevant. Instead of transforming geography, it might abolish it. The merging of mortals and machines will mean new ways of capturing and using energy, new ways of living together, new ways of fighting, and new ways of communicating. It will mean new ways of working, thinking, loving, and laughing; new ways of being born, growing old, and dying. It may even mean the end of all these things and the creation of a world beyond anything our unimproved, merely biological brains can imagine.

Any or all these things may come to pass.

Unless, of course, something prevents them.

THE WORST-CASE SCENARIO

Late in 2006, my wife and I were invited to a conference at Stanford University called “A World at Risk.” This star-studded event, featuring some of the world’s leading policy makers, took place on a bright winter’s day. The sun shone warmly from a clear blue sky as we made our way to the venue. The stock market, house prices, employment, and consumer confidence were at or near all-time highs. It was morning in America.

Over breakfast we heard from former secretaries of state and defense about the nuclear, biological, and terrorist threats facing us. Before lunch we learned about the shocking scale of environmental degradation and the high risk that international security would collapse, and as we ate we were told that global epidemics were virtually inevitable. And then things went downhill. We reeled from session to session in gathering gloom, overwhelmed by expert report after expert report on the rising tide of catastrophe. The conference had been a tour de force, but by the time the after-dinner speaker announced that we were losing the war on terror, the audience could barely even respond.
