Quantum Man: Richard Feynman's Life in Science
Authors: Lawrence M. Krauss
This is significant. It ascribed a unique status to a single scientist, and I don’t know of anyone else, in physics at least, for whom such a title would have even been considered. Feynman was in the process of becoming a physics icon, and the title was testimony not only to the nature of the material but also to the special place which Feynman was coming to occupy in the physics world.
In the end the actual course was a mixed success. Few of the students, even Caltech students, could follow all of the material. Over time, however, the memories of those who were lucky enough to attend mellowed. Many former students remembered it as the experience of a lifetime, echoing the words of Nobel laureate Douglas Osheroff, who later said, "The two-year sequence was an extremely important part of my education. Although I cannot say that I understood it all, I think it contributed most to my physical intuition."
But while the undergraduate guinea pigs at Caltech may have suffered (though Sands disputed this notion and argued that most of the students kept up at some level), the Feynman Lectures became a staple for anyone who planned to become a physicist. I remember buying my own copy when I was an undergraduate; I would read tidbits at a time, wondering whether I would ever really grasp all the material and hoping that one of my professors would use the book. Perhaps luckily for me, none of them did. Most who tried found the experiment disappointing. The material was too demanding, and too revolutionary, for the average physics class.
Nevertheless, the books remain in print; a new revised set appeared in 2005, and every year a new crop of students buys them, opens them up, and begins to experience a whole new world.
Unfortunately for Feynman, Sands, and Leighton, however, all royalties continue to go to Caltech (although Feynman's family later sued Caltech over rights to one of the lectures that Caltech packaged as a book and audiotape). Later on, after being called a physics giant, Feynman commiserated with his friend, the physicist and author Philip Morrison: "Are we physics giants, business dwarfs?"
The experience of teaching this course coincided with a general outburst of popular activity by Feynman, whose charismatic style was beginning to make waves well beyond the confines of the physics community. Already in 1958 he had agreed to be an advisor of a television program that Warner Brothers was producing, and in a letter regarding that production he indicated both his experience with popular outreach and his philosophy: “The idea that movie people know how to present this stuff, because they are entertainment-wise and the scientists aren’t is wrong. They have no experience in explaining ideas, witness all movies, and I do. I am a successful lecturer in physics for popular audiences. The real entertainment gimmick is the excitement, drama, and mystery of the subject matter. People love to learn something, they are ‘entertained’ enormously by being allowed to understand a little bit of something they never understood before.”
Around that time he also participated in what I believe was his first television interview, which aired shortly before Gweneth arrived in the United States. He was clearly excited about being on television, and advised her, “If you came 2 weeks earlier I’d sure have a lot for you to do—I’m going to be on television, in an interview with a news commentator on June 7th and there may be a lot of letters to answer.” The interview was a masterpiece, far exceeding the quality and intellectual depth of interviews performed nowadays, but because there was a frank discussion of religion in it, the network decided to air it at a different time than advertised, so the viewing audience was smaller.
The television production he had become an advisor on, About Time, finally aired on NBC in 1962. It elicited a large reaction among viewers and began to further establish Feynman's popular credentials. His prowess as a lecturer to lay audiences led to an invitation to give the prestigious Messenger Lectures at Cornell. This set of six lectures became famous and was compiled into a wonderful book titled The Character of Physical Law. (This was the very book that my summer school physics instructor had recommended I read in order to get me more excited about physics.) The lectures were also recorded on film, and recently Bill Gates bought the rights to them so they could be available online. (Gates said that if he had had access to them when he was a student, before he dropped out of Harvard, his life might have changed.) They, more than any other recorded image or document, capture the real Feynman: playful, brilliant, excited, charismatic, energetic, and no-nonsense.
Finally, on October 21, 1965, Feynman was "canonized," cementing his status forever among scientific and lay audiences. Feynman, along with Sin-Itiro Tomonaga and Julian Schwinger, shared the 1965 Nobel Prize for their "fundamental work in quantum electrodynamics, with deep-ploughing consequences for the physics of elementary particles." Like that of every other Nobel laureate, Feynman's life was forever changed, and he worried about this effect. While there is little doubt he enjoyed celebrity, he didn't like pomp and circumstance, and motivated by the attitude he had gleaned from his father as a child, he truly distrusted honorific titles. He matched his thoughts with actions. He had decided as a young man, graduating from MIT, that honorary degrees were silly—those being honored with the degrees had not done as much work as he had to attain his—so he refused to accept any that were offered to him. In the 1950s he had been elected to membership in the prestigious National Academy of Sciences, for many scientists the highest recognition they can get from their colleagues. Beginning in 1960, Feynman began a long and involved process to resign from the National Academy because he viewed its prime purpose as determining who was "in" and who was not "worthy." (In a famous episode years later, Carl Sagan was rejected for membership, many think at least in part because of his popularization efforts.) He stopped listing it among his honors (he asked the NBC television people, for example, to remove it from his biographical sketch for the 1962 television program), but it took ten years before the Academy officials finally made his resignation official.
It is hard to know how serious Feynman was, but he later wrote that for a moment he considered refusing the Nobel Prize for these same reasons—who cared if someone in the Swedish Academy decided that his work was "noble" enough? As he famously said, "The prize is the pleasure of finding things out." But he quickly realized that refusing would generate more publicity than getting the prize, and might lead to the impression that he thought he was "too good" for it. He claimed that the Nobel Committee should instead quietly let prize winners know of its decision in advance, and give them time to back out. According to Feynman, he wasn't the only one who had this idea. His idol Dirac had felt the same way.
In spite of his misgivings, Feynman clearly felt some validation by the prize, and as his former student Albert Hibbs said, he probably would have felt worse if he hadn’t gotten it. He also liked the fame the prize and other recognitions brought, not least because it gave him more freedom to act as he wished.
Be that as it may, in spite of his intense nervousness about messing up during the official ceremony, about bowing and wearing formal attire, and about walking backward in the presence of the Swedish king, Feynman persevered, attended the ceremony, and prepared a beautiful Nobel lecture giving a truly personal history of his journey to discover how to tame the infinities of QED. And even as late as 1965, Feynman still felt the program of renormalization that he had spearheaded was merely a way of sweeping problems of infinities under the rug, not curing them.
Associated with the awarding of the prize was the question that often arises because of the phrasing of Mr. Nobel's will, which seems to imply that only one person can win. It was clear in this case that Julian Schwinger, Freeman Dyson, and Sin-Itiro Tomonaga all deserved a share of the prize, but why didn't Dyson receive it? He had so skillfully demonstrated the equivalence of the seemingly totally different methods of deriving a sensible form of QED, and had followed that up by essentially providing a guide to teach the rest of the physics world how to do the appropriate calculations. Dyson, you will recall, was also essentially the one whose papers advertised Feynman's results before Feynman ever wrote his own papers, and who ultimately helped explain to the world that Feynman's methods were not ad hoc, but as well grounded as the other approaches, and much more physically intuitive and calculationally simple. It was thus Dyson who had helped the rest of the world understand QED, while establishing Feynman's methods as the ones that would ultimately take root and grow.
If Dyson had bad feelings about not receiving the prize, he never verbalized them. Quite the opposite, in fact. As he later put it, "Feynman made the big discoveries, and I was just really a publicizer. I got well rewarded for my part in the business—I got a beautiful job here at the Institute, set up for life, so I've nothing to complain about! No, I think that it was entirely right and proper. Feynman's was one of the best-earned Nobel Prizes there ever was, I would say."
CHAPTER 15
Twisting the Tail of the Cosmos
I think I've got the right idea, to do crazy things . . .
—RICHARD FEYNMAN
The years between 1957 and 1965 represented a transitional period in Richard Feynman's life. Personally, he went from womanizer to family man, from solitary wanderer to domesticated husband and father (though he never stopped seeking adventures, now, more often than not, with his family). Professionally, he went from someone urgently working, essentially for his own immediate pleasure, at the leading edge of physics to someone who had begun to give back to the world the wisdom he had gained from his years of thinking about nature (though he probably would never have claimed that what he had was wisdom).
In the meantime he had become one of the most well-known and colorful expositors and teachers in physics, and, in part, its conscience. He remained acutely aware, and ensured whenever he could that his colleagues and the public at large did not lose track of what science was and what it wasn’t, what excitement could be gained from studying it, and what nonsense could result from overinterpreting its signals, from unfounded claims, or simply by losing touch with it completely. He felt strongly that science required a certain intellectual honesty, and that the world would be a better place if this was more widely understood and practiced.
This is not to suggest that Feynman the person changed in any fundamental way. He remained intensely interested in all aspects of physics and, as I just mentioned, he continued to seek out adventures, just adventures of a different kind. Besides exotic trips with his wife, he took up two activities that might be called sublimation. One involved working on his calculations almost every day in a Pasadena strip joint, where he could watch the girls whenever calculations were going badly, and another involved combining a long-standing interest in drawing with a long-standing interest in nude women whom he could draw. He actually became fairly accomplished at this, which is paradoxical, since as a young man he had scoffed at music and art, but by middle age he had taken up both. Equally paradoxical was a new fascination, in the 1960s, with visiting various New Age establishments like Esalen, where he would enjoy both the scenery and the opportunity to relieve the participants of their beliefs in "hokey-pokey" fairy tales, as he put it. Perhaps the attraction of the nude baths, and the fascination with interacting with a completely different sort of individual than he would otherwise meet, outweighed his own lack of tolerance for those who abused the concepts of physics, like quantum mechanics, to justify their "anything goes" mentality.
In his professional life, as his fame increased, he moved aggressively to protect his time. He wanted to ensure that he didn't become a "great man" in the traditional sense, encumbered by a host of administrative responsibilities, which he shirked at every possible moment. He had even recorded a bet with Victor Weisskopf, when he visited CERN in Geneva after he received the Nobel Prize, that within ten years he would not hold a "responsible position"—that is, a position that "by reasons of its nature, compels the holder to issue instructions to other persons to carry out certain acts, notwithstanding the fact that the holder has no understanding whatsoever of that which he is instructing the aforesaid persons to accomplish." Needless to say, he won the bet.
His growing fame encouraged another tendency that had paid off for him in the past, though in the long run it cost him what could have been a great deal more success in continuing to lead in discovering new physics. He became more and more convinced that in order to blaze new paths, he needed to disregard much of what others were doing, and in particular not focus on the “problems du jour.”
Physics is, after all, a human social activity, and at any time there is often consensus about what the “hot” problems are, and which directions are most likely to lead to new insights. Some view this faddishness as a problem, as for instance, the fascination of much of the community over the past twenty-five years with string theory, a mathematically fascinating set of ideas whose lack of direct contact with the empirical world has been outweighed only by the increasing confusion about what it might predict about nature. (Nevertheless, in spite of this, the mathematics of string theory has led to a number of interesting insights about how to perform calculations and interpret the results of more conventional physics.)
It is inevitable that groups of people with similar interests will get excited about similar things. And ultimately, fads in science don’t matter, because first, all of the activity inevitably reveals the warts as well as the beauty marks quicker than would otherwise have been the case, and second, as soon as nature points out the right direction, scientists will jump off a sinking ship faster than rats in a storm.
In order for science to be healthy, it is important that not all scientists jump on the same bandwagon, and this was the point that Feynman focused on, almost to obsession. He was so talented and so versatile that he was able, if necessary, to reinvent almost any wheel and usually improve it in the process. But by the same token, reinventing the wheel takes time and is rarely worth the trouble.
It wasn’t just that he could take this road; it was that he often felt he had to. This was both a strength and a weakness. He really didn’t trust any idea unless he had worked it out from first principles using his own methods. This meant that he understood a plethora of concepts more deeply and thoroughly than most others, and that he had a remarkable bag of tricks from which he could pull magic solutions to a host of varied problems. However, it also meant that he was not aware of brilliant developments by others that could have illuminated his own work in new ways, leading him further than he could have gotten on his own.
As Sidney Coleman, a brilliant and remarkably well-respected Harvard physicist who had been a student of Gell-Mann’s at Caltech in the 1950s and who had interacted with Feynman throughout his career, put it, “I’m sure Dick thought of that as a virtue, as noble. I don’t think it’s so. I think it’s kidding yourself. Those other guys are not all a collection of yo-yos. . . . I know people who are in fact very original and not cranky but have not done as good physics as they could have done because they were more concerned at a certain juncture with being original than with being right. Dick could get away with a lot because he was so goddamn smart.”
Feynman did get away with a lot. But could he have done much more if he had agreed every now and then to build on well-trodden paths rather than seek out new ones? We will never know. However, an honest assessment of his contributions to science from 1960 or so onward demonstrates several trends that continued to repeat themselves. He would explore a new area, developing a set of remarkably original mathematical techniques and physical insights. These would ultimately contribute to central developments by others, which would lead to a host of major discoveries and essentially drive almost every area of modern theoretical and experimental physics. This ranged from his work in condensed matter physics to our understanding of the weak and strong interactions, to the basis of current work in quantum gravity and quantum computing. But he himself did not make the discoveries or win prizes. In this sense, he continued to push physics forward as few modern scientists have, opening up new areas of study, producing key insights, and creating interest where there had been none before, but he tended to lead from the rear or, at best, from a side flank.
Whether or not this would have disturbed him is unclear. In spite of his natural tendency to showboat, as I have described, he was ultimately more interested in being right than being original, and if his work led others to uncover new truths, he might remain skeptical of their results for a long time, but eventually the satisfaction of having provided illumination in the darkness gave him deep pleasure. And by concentrating on difficult problems the mainstream would not approach, he increased his chances of providing such illumination.
FEYNMAN'S FIRST FORAY well off the beaten path involved his desire, beginning around 1960, to understand how one might formulate a quantum theory of gravity. There were good reasons for his interest. First, while developing such a theory had thus far eluded all who had thought about it, he had already been successful in developing a consistent quantum theory of electromagnetism when others had been stymied, and he thought his experience with QED might lead somewhere useful. Second, Einstein's general relativity had long been considered the greatest scientific development since Newton. It was, after all, a new theory of gravity. But when one considered its behavior on small scales, it appeared to be flawed. The first person who could set this theory straight would surely be viewed as the rightful heir to Einstein. But perhaps the biggest attraction for Feynman was that no one else, at least no one who really mattered, was thinking about the problem. As he put it in a letter to his wife from a conference on gravity he attended in Warsaw in 1962, "This field (because there are no experiments) is not an active one, so few of the best men are doing work in it."
That was probably somewhat of an overstatement, but in truth the study of general relativity had become a field unto itself since Einstein’s great discovery of his classical field equations in 1915. Because general relativity implies that matter and energy affect the very nature of space itself, allowing it to curve, expand, and contract, and that this configuration of space then affects the subsequent evolution of matter and energy, which then continues to impact on space, and so on, the theory is both mathematically and physically far more complicated than Newton’s theory of gravity had been.
A great deal of work was done to find mathematical solutions to these equations in order to explain phenomena ranging from the dynamics of the universe to the behavior of the last moments of stars as they burn out their nuclear fuel. The equations were complicated enough, and their physical interpretation confusing enough, that tremendous ingenuity and mathematical prowess were required, and a small industry of experts had developed to investigate new techniques to deal with these issues.
To get a sense of how complicated the situation actually was, it took a full twenty years and lots of detours down blind alleys and errors, including some famous ones by Einstein himself, before scientists realized that general relativity was incompatible with a static and eternal universe, which was the preferred scientific picture of the cosmos at the time. In order to allow for such a universe in which our galaxy was surrounded by static empty space, Einstein added his famous cosmological constant (which he later called his biggest blunder).
The Russian physicist Alexander Friedmann first wrote down the equations for an expanding universe in 1924, but for some reason the physics community largely overlooked them. The Belgian priest and physicist Georges Lemaître independently rediscovered the equations and published them in an obscure journal in 1927. While Lemaître’s work did not receive general notice, Einstein certainly was aware of it, and wrote to Lemaître: “Your calculations are correct, but your physics is abominable.”
It was not until the 1930s, after Edwin Hubble's observation of the expanding universe through the motion of distant galaxies, that Lemaître's work was translated into English and began to receive general acceptance, including by Einstein. In 1931 Lemaître published his famous article in Nature outlining his "primeval atom" model, which eventually became known as the big bang. Finally, in 1935, Howard Robertson and Arthur Walker rigorously proved that the only uniform and isotropic space (by then it had become recognized that our galaxy was not alone in the universe, and that space was largely uniform in all directions with galaxies everywhere—an estimated 400 billion in our observable universe) was the expanding big bang described by Friedmann and Lemaître. After that, the big bang became the preferred theoretical cosmological model, although it actually took another thirty years—after Feynman began his work—before the actual physical signatures resulting from a big bang were seriously explored, and the discovery of the cosmic microwave background radiation put it on an unequivocal empirical footing.
While it took two decades to sort out the cosmological implications of general relativity, the most familiar of all situations associated with gravity, the gravitational collapse of a spherical shell of matter, remained confused for far longer, and is still not fully understood.
Within a few months of Einstein's development of general relativity, the German physicist Karl Schwarzschild wrote down the exact and correct solution describing the nature of space and the resulting gravitational field outside of a spherical mass distribution. However, the equations produced infinite results at a finite radius from the center of the distribution, a radius now called the Schwarzschild radius. At the time, it was not understood what this infinity meant: whether it was simply a mathematical artifact or whether it reflected some new physical phenomenon taking place at this scale.
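The radius at which Schwarzschild's solution becomes singular is given by the simple formula r_s = 2GM/c². As a rough illustration (the constants and function name below are my own, not from the book), plugging in the mass of our sun gives a radius of about three kilometers:

```python
# Illustrative sketch: the Schwarzschild radius r_s = 2GM/c^2,
# evaluated for the sun using standard physical constants.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def schwarzschild_radius(mass_kg: float) -> float:
    """Radius (in meters) at which Schwarzschild's solution blows up."""
    return 2 * G * mass_kg / c**2

r_s = schwarzschild_radius(M_SUN)
print(f"{r_s / 1000:.2f} km")  # roughly 2.95 km
```

The striking point is how small this is: the sun would have to be compressed to a ball a few kilometers across before the puzzling infinity in Schwarzschild's equations became physically relevant.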
The famous (and eventually Nobel Prize–winning) Indian scientist Subrahmanyan Chandrasekhar considered the collapse of real objects like stars and argued that for stars larger than about 1.44 times the mass of our sun, no known force could stop their collapse down to this radius or smaller. The famous astrophysicist Arthur Eddington, however, whose own observations of a total eclipse in 1919 had provided the first experimental validation of general relativity (and catapulted Einstein to world fame), violently disagreed with this result and ridiculed it.
Ultimately Robert Oppenheimer demonstrated that Chandrasekhar's result was correct after a slight refinement (about 3 solar masses or greater). But the question still remained: what happened when stars collapsed down to the Schwarzschild radius? One of the strangest apparent implications of the theory was that from the point of view of an outside observer, as massive objects collapsed, time would appear to slow down as the Schwarzschild radius was approached, so that the objects would seem to "freeze" at this point before they could collapse further, leading to the name frozen stars for such objects.