The Universe Within
Neil Turok

Our society has been shaped by physics' past discoveries to an extent that is seldom appreciated. The mechanical world of Newton led to mechanical ways of learning, as well as to the modern industrial age. We are all very aware of how the digital revolution is transforming our lives: computers are filling our schools and offices, replacing factory workers, miners, and farmers. They are changing the way we work, learn, live, and think. Where did this new technology come from? It came, once again, from our capacity to understand, invent, and create: from the Universe Within.

THE STORY OF HOW physics created the information age begins at the turn of the twentieth century, when electricity was becoming the lifeblood of modern society. There was galloping demand to move current around — quickly, predictably, safely — in light bulbs, radios, telegraphs, and telephones. Joseph John (J. J.) Thomson's discovery of the electron in 1897 had explained the nature of electricity and launched the development of vacuum tubes.

For most of the twentieth century, amplifying vacuum tubes were essential components of radios, telephone equipment, and many other electrical devices. Each consists of a sealed glass tube with a metal filament inside that releases lots of electrons when it is heated up. The negatively charged electrons stream from this filament (the “cathode”) towards a positively charged metal plate (the “anode”) at the other end of the tube, carrying the electrical current. This simple two-terminal arrangement, called a “diode,” allows current to flow only one way. In more complicated arrangements, one or more electrical grids are inserted between the cathode and anode. By varying the voltage on the grids, the flow of electrons can be controlled: if things are arranged carefully, tiny changes in the grid voltage result in large changes in the current. This is an amplifier: it is like controlling the flow of water from a tap. Gently twiddling the tap back and forth leads to big changes in the flow of water.
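
For readers who like to see the tap analogy in numbers, here is a minimal sketch. It treats the tube as nothing more than a transconductance (a current change per volt of grid wiggle) feeding a load resistor; the figures are illustrative assumptions, not data for any real tube.

```python
# A minimal sketch of triode-style amplification: a small wiggle in grid
# voltage produces a proportionally larger swing in output voltage.
# Both numbers below are assumed, ballpark values.

g_m = 2e-3       # transconductance, amps per volt of grid swing (assumed)
R_load = 50e3    # load resistor, ohms (assumed)

def output_swing(grid_swing_volts):
    """Voltage swing across the load for a given grid-voltage wiggle."""
    current_swing = g_m * grid_swing_volts   # change in plate current
    return current_swing * R_load            # Ohm's law across the load

print(output_swing(0.1))   # a 0.1 V grid wiggle becomes a 10 V output swing
```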

Vacuum tubes were used everywhere — in radios, in telephone and telegraph exchanges, in televisions and the first computers. However, they have many limitations. They are large and have to be warmed up. They use lots of power, and they run hot. Made of glass, they are heavy, fragile, and expensive to manufacture. They are also noisy, creating a background “hum” of electrical noise in any device using them.

In Chapter One, I described the Scottish Enlightenment and how it led to a flowering of education, literature, and science in Scotland. James Clerk Maxwell was one of the products of this period, as were the famous engineers James Watt, William Murdoch, and Thomas Telford; the mathematical physicists Peter Guthrie Tait and William Thomson (Lord Kelvin); and the writer Sir Walter Scott. Another was Alexander Graham Bell, who followed Maxwell to Edinburgh University before emigrating to Canada, where he invented the telephone in Brantford, Ontario — and in so doing, launched global telecommunications.

Bell believed in the profound importance of scientific research, and just as his company was taking off in the 1880s, he founded a research laboratory. Eventually christened Bell Labs, this evolved into the research and development wing of the U.S. telecommunications company
AT&T
, becoming one of the most successful physics centres of all time, with its scientists winning no fewer than seven Nobel Prizes.
83

At Bell Labs, the scientists were given enormous freedom, with no teaching duties, and were challenged to do exceptional science. They were led by a visionary, Mervin Kelly, who framed Bell Labs as an “institute for creative technology,” housing physicists, engineers, chemists, and mathematicians together and allowing them to pursue investigations “sometimes without concrete goals, for years on end.”[84] Their discoveries ranged from the basic theory of information and communication and the first cellular telephones to the first detection of the radiation from the big bang; they invented lasers, computers, solar cells, CCDs, and the first quantum materials.

One of quantum theory's successes was to explain why some materials conduct electricity while others do not. A solid material consists of atoms stacked together. Each atom consists of a cloud of negatively charged electrons orbiting a positively charged nucleus. The outermost electrons are farthest from the nucleus and the least tightly bound to it — in conducting materials like metals, they are free to wander around. Like the molecules of air in a room, the free electrons bounce around continuously inside a piece of metal. If you connect a battery across the metal, the free electrons drift through it in one direction, forming an electrical current. In insulating materials, there are no free electrons, and no electrical currents can flow.

Shortly after the Second World War, Kelly formed a research group in solid state physics, under William Shockley. Their goal was to develop a cheaper alternative to vacuum tubes, using semiconductors — materials that conduct electricity only weakly, somewhere between metals and insulators. Semiconductors were already being used, for example, in “point-contact” electrical diodes, where a thin needle of metal, called a “cat's whisker,” was placed in contact with a piece of semiconductor crystal (usually galena, a crystalline form of lead sulphide). At certain special points on the surface, the contact acts like a diode, allowing current to flow only one way. Early “crystal” radio sets used these diodes to convert “amplitude modulated” AM radio signals into DC currents, which then drove a headset or earphone. In the 1930s, Bell scientists explored using crystal diodes for very high frequency telephone communications.
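
The diode's one-way conduction is the whole trick of the crystal radio: chop off half the wave, then smooth away the fast carrier, and what remains is the audio. Here is a minimal numerical sketch of that envelope detection; the frequencies and the crude moving-average filter are illustrative assumptions.

```python
# A minimal sketch of how a crystal diode demodulates an AM signal.
import numpy as np

fs = 200_000                       # samples per second (assumed)
t = np.arange(0, 0.01, 1 / fs)     # 10 ms of signal
carrier = np.sin(2 * np.pi * 20_000 * t)    # 20 kHz carrier (assumed)
audio = 0.5 * np.sin(2 * np.pi * 500 * t)   # 500 Hz tone (assumed)
am_signal = (1 + audio) * carrier           # amplitude-modulated wave

rectified = np.maximum(am_signal, 0)        # the diode: current flows one way only

# Smooth away the carrier with a moving average, leaving the slow audio
# envelope that would drive the earphone.
window = int(fs / 20_000)
envelope = np.convolve(rectified, np.ones(window) / window, mode="same")
```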

During the war, lots of effort had gone into purifying semiconductors like germanium and silicon, on the theory that removing impurities would reduce the electrical noise.[85] But it was eventually realized that the magic spots where the crystal diode effect works best correspond to impurities in the material. This was a key insight — that controlling the impurities is the secret to the fine control of electrical current.

Just after the war, Shockley had tried to build a semiconductor transistor, but had failed. When Kelly asked Shockley to lead the Solid State Physics group, Kelly placed the theorist John Bardeen and the experimentalist Walter Brattain under Shockley's supervision. The two then attempted to develop the “point-contact” idea, using two gold contacts on a piece of germanium which had been “doped” — seeded with a very low concentration of impurities to allow charge to flow through the crystal.

They were confounded by surface effects, which they initially overcame only through the drastic step of immersing the transistor in water, hardly ideal for an electrical device. After two years' work, their breakthrough came in the “miracle month” of November–December 1947, when they wrapped a ribbon of gold foil around a plastic triangle and sliced the ribbon through one of the triangle's points. They then pushed the gold-wrapped tip into the germanium to enable a flow of current through the bulk of the semiconductor. A voltage applied to one of the two gold contacts was then found to amplify the electric current flowing from the other contact into the germanium, like a tap being twiddled to control the flow of water.[86]

Bardeen, Brattain, and Shockley shared the 1956 Nobel Prize in Physics for their discovery of the transistor, which launched the modern electronics age. Their “point contact” transistor was quickly superseded by “junction” transistors, eventually to be made from silicon. Soon after, the team split up. Bardeen left for the University of Illinois, where he later won a second Nobel Prize. Shockley moved out to California, where he founded Shockley Semiconductor. He recruited eight talented young co-workers who, after falling out with him, left to found Fairchild Semiconductor and, later, Intel, thereby launching Silicon Valley.

Transistors can control the flow of electricity intricately, accurately, and dependably. They are cheap to manufacture and have become easier and easier to miniaturize. Indeed, to date, making computers faster and more powerful has almost entirely been a matter of packing more and more transistors onto a single microprocessor chip.

For the past forty years, the number of transistors that can be packed onto a one-square-centimetre chip has doubled every two years — an effect known as Moore's law, which is the basis for the information and communication industry's explosive growth. There are now billions of transistors in a typical smartphone or computer CPU. But there are also fundamental limits, set by the size of the atom and by Heisenberg's uncertainty principle. Extrapolating Moore's law, transistors will hit these ultimate limits one or two decades from now.
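
Doubling every two years compounds very quickly, and the arithmetic in this paragraph is easy to check. Over forty years that is twenty doublings, roughly a millionfold; the starting figure of a few thousand transistors below is an assumed, illustrative one.

```python
# Moore's law as stated in the text: transistor counts double every two years.
def moores_law_factor(years, doubling_period=2):
    return 2 ** (years / doubling_period)

print(moores_law_factor(40))         # 2**20 = 1,048,576: about a millionfold
# Starting from a few thousand transistors (an assumed figure, roughly the
# scale of the earliest microprocessors), forty years of doubling lands in
# the billions, consistent with today's chips:
print(3_000 * moores_law_factor(40)) # ~3 billion
```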

In modern computers, information consists of strings of 0s and 1s stored in a pattern of electrical charges or currents or magnetized states of matter, and then processed via electrical signals according to the computer program's instructions. Typically, billions of operations are performed per second upon billions of memory elements. It is crucial to the computer's operation that the 0s and 1s are stored and changed accurately and not in unpredictable ways.

The problem is that the moving parts of a computer's memory — in particular, the electrons — are not easy to hold still. Heisenberg's uncertainty principle says that if we fix an electron's position, its velocity becomes uncertain and we cannot predict where it will move next. If we fix its velocity, and therefore the electrical current it carries, its position becomes uncertain and we don't know where it is. This problem becomes unimportant when large numbers of electrons are involved, because to operate a device one only needs the average charge or current, and for many electrons these can be predicted with great accuracy. However, when circuits get so tiny that only a few electrons are involved in any process, then their quantum, unpredictable nature becomes the main source of error, or “noise,” in the computer's operations. Today's computers typically store one bit of data in about a million atoms and electrons, although scientists at IBM Labs have made a twelve-atom bit register called “atomic-scale memory.”[87]
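
The uncertainty principle can be put in symbols: the spread in position times the spread in momentum is at least ħ/2. A back-of-envelope estimate shows how violent the resulting jitter is at transistor scales; the physical constants below are standard, while the one-nanometre confinement size is an assumed, illustrative device dimension.

```python
# Confine an electron to a nanometre-scale region and the uncertainty
# principle (dx * dp >= hbar / 2) forces a large spread in its velocity.
hbar = 1.054571817e-34   # reduced Planck constant, J*s
m_e = 9.1093837015e-31   # electron mass, kg

def min_velocity_spread(dx_metres):
    """Minimum velocity uncertainty for an electron confined to dx."""
    dp = hbar / (2 * dx_metres)   # minimum momentum uncertainty
    return dp / m_e               # convert momentum spread to velocity spread

print(min_velocity_spread(1e-9))  # ~58,000 m/s for a 1 nm region
```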

QUANTUM UNCERTAINTY IS THE modern version of the impurities in semiconductors. Initially impurities were seen as a nuisance, and large sums of money were spent trying to clean them away, before it was realized that the ability to manipulate and make use of them was the key to the development of cheap, reliable transistors. The same story is now repeating itself with “quantum uncertainty.” As far as classical computers are concerned, quantum uncertainty is an unremovable source of noise, and nothing but a nuisance. But once we understand how to use quantum uncertainty instead of trying to fight it, it opens entirely new horizons.

In 1984, I was a post-doctoral researcher at the University of California, Santa Barbara. It was announced that the great Richard Feynman was going to come and give a talk about quantum computers. Feynman was one of our heroes, and this was an opportunity to see him first-hand. Feynman's talk focused on the question of whether there are ultimate limits to computation. Some scientists had speculated that each operation of a computer inevitably consumes a certain amount of energy, and that ultimately this would limit the size and power of any computer. Feynman's interest was piqued by this challenge, and he came up with a design that overcame any such limit.

There were several aspects to his argument. One was the idea of a “reversible” computer that never erased (or overwrote) anything stored in its memory. It turns out that this is enough to overcome the energy limit. The other new idea was how to perform computations in truly quantum ways. I vividly remember him waving his arms (he was a great showman), explaining how the quantum processes ran forwards and backwards and gave you just what you needed and no more.
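
Feynman did not spell out a construction in the talk, but the flavour of reversible logic can be shown with a standard example (not his specific design): the Toffoli gate. It flips a target bit only when both control bits are 1, and it is its own inverse, so running it twice undoes it and no information is ever erased, which is the property that sidesteps the energy cost of overwriting memory.

```python
# The Toffoli gate, a standard reversible logic gate: no input information
# is ever destroyed, so the computation can be run backwards.
def toffoli(a, b, c):
    """Controlled-controlled-NOT: (a, b, c) -> (a, b, c XOR (a AND b))."""
    return a, b, c ^ (a & b)

state = (1, 1, 0)
forward = toffoli(*state)      # (1, 1, 1)
backward = toffoli(*forward)   # applying the gate again restores (1, 1, 0)
assert backward == state       # reversible: nothing was lost
```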

Feynman's talk was entirely theoretical. He didn't speak at all about building such a device. Nor did he give any specific examples of what a quantum computer would be able to do that a classical computer could not. His discussion of the theory was quite basic, and most of the ingredients could be found in any modern textbook. In fact, there was really no reason why all of this couldn't have been said many decades ago. This is entirely characteristic of quantum theory: simply because it is so counterintuitive, new and unexpected implications are still being worked out today. Although he did not have any specific examples of the uses of a quantum computer, Feynman got people thinking just by raising the possibility. Gradually, more and more people started working on the idea.

In 1994, there came a “bolt from the blue.” U.S. mathematician Peter Shor, working at Bell Labs (perhaps unsurprisingly!), showed mathematically that a quantum computer would be able to find the prime factors of large numbers much faster than any known method on a classical computer. The result caused a shockwave, because the secure encryption of data (vital to the security systems of government, banks, and the internet) most commonly relies on the fact that it is very difficult to find the prime factors of large numbers. For example, if you write down a random 400-digit number (which might take you five minutes), then even with the best known algorithm and the most powerful conceivable classical computer, it would take longer than the age of the universe to discover the number's prime factors. Shor's work showed that a quantum computer could, in principle, perform the same task in a flash.
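
The scale of that claim can be sketched crudely. Real factoring algorithms are far better than naive trial division, but even granting a wildly generous machine, the naive count makes the point; the operations-per-second figure below is an assumption.

```python
# Why factoring a 400-digit number is classically hopeless, in crude terms:
# trial division must test candidate divisors up to the square root,
# i.e. about 10**200 of them.
candidates = 10 ** 200                   # divisors up to sqrt(10**400)
ops_per_second = 10 ** 18                # an assumed exascale machine
seconds_needed = candidates // ops_per_second

age_of_universe_seconds = 4 * 10 ** 17   # roughly 13.8 billion years
print(seconds_needed // age_of_universe_seconds)  # ~10**164 universe-ages
```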

What makes a quantum computer so much more powerful than a classical one? A classical computer is an automatic information-processing machine. Information is stored in the computer's memory and then read and manipulated according to pre-specified instructions — the program — also stored in the computer's memory. The main difference between a classical and quantum computer is the way information is stored. In a classical computer, information is stored in a series of “bits,” each one of which can take just two values: either 0 or 1. The number of arrangements of the bits grows exponentially with the length of the string. So whereas there are only two arrangements for a single bit, there are four for two bits, eight for three, and there are nearly a googol (one with a hundred zeros after it) ways of arranging three hundred bits. You need five bits to encode a letter of the alphabet and about two million bits to encode all of the information in a book like this. Today, a typical laptop has a memory capacity measured in gigabytes, around ten billion bits (a byte is eight bits), with each gigabyte of memory capable of storing five thousand books.
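
The counting in this paragraph can be verified directly; all the figures below are the text's own round numbers.

```python
# Checking the text's counting: arrangements of n bits grow as 2**n.
import math

print(2 ** 1, 2 ** 2, 2 ** 3)        # 2, 4, 8 arrangements for 1, 2, 3 bits
print(len(str(2 ** 300)))            # 91 digits: approaching a googol (10**100)
print(math.ceil(math.log2(26)))      # 5 bits suffice for a 26-letter alphabet

bits_per_gigabyte = 8 * 10 ** 9      # a byte is eight bits
bits_per_book = 2 * 10 ** 6          # ~2 million bits for a book like this
print(bits_per_gigabyte // bits_per_book)  # 4000: roughly five thousand books
```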
