Why, though, should this be true at the quantum level and not the classical? Why don't you see an interference pattern when, say, throwing baseballs through your neighbor's front windows? According to the physicist Roger Penrose, it's simply a matter of scale. The probability distribution for the position of a given particle includes a cross term which, when we compare one possible path with another, can reinforce or cancel out its equivalent in the second probability distribution, thus generating the interference pattern. But when we scale up to the realm of classical physics, we are dealing with a vast number of simultaneous probability distributions: the key term effectively “averages out” to zero, leaving only the individual probability distribution for each path. Your baseballs will go through one window or the other, and there will be no interesting pattern on the far wall with which to distract your angry neighbor. “Do you really believe that?” The spirit of Ngidi is never far away in quantum mechanics, and if you're not entirely satisfied that probability is the fundamental reality, you are not alone. Einstein, though he himself had first come up with photons as quanta of radiant energy, profoundly disliked the idea of simply agreeing that such things had no physical presence until observed. Richard Feynman blithely stated: “I think it is safe to say that no one understands quantum mechanics.” Perhaps, like position, understanding (in the sense of making a coherent inner picture of an unseen reality) ceases to have meaning at certain scales. At least the equations work; or, in the dictum often pinned on Feynman but actually coined by the physicist David Mermin: “Shut up and calculate.”
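You can watch that key term do its work in a toy calculation. The sketch below is my own illustration, not anything from the text: the wavelength, slit separation, and screen distance are arbitrary values, and each slit is modeled as a point source of complex amplitude. Adding amplitudes before squaring produces the fringes; jittering the relative phase over many repetitions, a crude stand-in for Penrose's classical averaging, washes the cross term away.

    import numpy as np

    wavelength = 1.0
    k = 2 * np.pi / wavelength            # wavenumber
    slit_sep = 5.0                        # distance between the slits (arbitrary)
    screen_dist = 100.0                   # slit-to-screen distance (arbitrary)

    x = np.linspace(-40, 40, 1001)        # positions along the far wall
    r1 = np.hypot(screen_dist, x - slit_sep / 2)   # path length via slit 1
    r2 = np.hypot(screen_dist, x + slit_sep / 2)   # path length via slit 2

    psi1 = np.exp(1j * k * r1) / r1       # complex amplitude via slit 1
    psi2 = np.exp(1j * k * r2) / r2       # complex amplitude via slit 2

    quantum = np.abs(psi1 + psi2) ** 2                 # amplitudes add, then square
    classical = np.abs(psi1) ** 2 + np.abs(psi2) ** 2  # probabilities add: no fringes
    cross = quantum - classical           # the interference term
    print("largest fringe term:", cross.max())

    # "Averaging out": randomize the relative phase over many shots and the
    # cross term vanishes on average, leaving only the classical sum.
    rng = np.random.default_rng(0)
    acc = np.zeros_like(classical)
    for phase in rng.uniform(0, 2 * np.pi, 2000):
        acc += np.abs(psi1 + psi2 * np.exp(1j * phase)) ** 2
    print("fringe term after averaging:", np.abs(acc / 2000 - classical).max())

The first number is visibly nonzero; the second shrinks toward zero as the number of phase-jittered shots grows, which is the whole difference between the quantum and the baseball case.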
Is there any way we can think about this without ending in gibbers and squeaks? Perhaps: there are situations in real life where we are aware of a field of probability separate from particular moments and positions. If you drive to work or school, you probably imagine your route as a probability field, with certain lanes between certain intersections offering a greater potential for getting past that crawling bus than others. This morning or tomorrow morning you may or may not actually be in the left lane as you pass the doughnut store, crowing in triumph or growling in despair; but the route as it exists in your mind, like the two-slit arrangement, is both specific and probabilistic.
Granted, there will always be a slight whiff of medieval theology at the extreme scales of physics, a touch of credo quia impossibile, “I believe because it is impossible.” Let us therefore shift back into the realms of the visible and palpable by selecting a big, bluff, no-nonsense nineteenth-century example: the steam boiler.
What is the source of its power? Motion. Molecules of water vapor, hot and excited, rocket around the boiler's confined space, caroming into each other and into the sides of the vessel, thus producing pressure. But already, in the course of that sentence, we have run into the necessity of mixing individual and collective description: particles pursuing their frantic courses and pressure measured across them all. Each molecule is a perfect Newtonian agent; each collision obeys the same laws of motion as do the sudden meetings of billiard balls and linebackers. The whole system is classical and deterministic. If you wanted a model position-and-velocity universe for training Laplace's all-knowing demon, this would be it. Yet, while we could set the demon its task, we could never begin to achieve it ourselves, not even to predict the positions of the molecules in 1 cubic millimeter of steam 1 millisecond from now. We encountered this problem when we looked at the weather: complexity imposes limits on predictability. We can set the equations, but this does not mean we can solve them.
The movement of individual molecules, buffeted by those around them, is essentially deterministic but effectively random. This means that many of the basic qualities we ascribe to physical systems (heat, mechanical work, pressure) are impossible to define except statistically. The great nineteenth-century physicist James Clerk Maxwell considered the properties of a gas (pressure of steam, for example) in terms of a statistical distribution of qualities among its constituent molecules (in this case, their velocity) and found this distribution was the same as the normal curve. To come to this conclusion, he had to make the same kinds of assumptions about the grubby, prosaic boiler that we have been making about our various more rarefied examples of probabilistic systems: that the elements are evenly distributed; that the system as a whole is in equilibrium; that each molecule has an equal probability of going in any direction; in other words, that you could represent this system by true, unchanging dice rolled fairly.
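Maxwell's conclusion is easy to state in modern terms: at equilibrium, each component of a molecule's velocity follows the normal curve with variance kT/m. Here is a minimal sketch (mine, not Maxwell's derivation; the temperature and molecular mass are illustrative textbook values for steam) that samples that distribution and checks the sample against the theoretical figures.

    import numpy as np

    k_B = 1.380649e-23     # Boltzmann's constant, joules per kelvin
    T = 373.0              # steam at 100 degrees Celsius, in kelvins
    m = 2.99e-26           # approximate mass of one water molecule, kg

    sigma = np.sqrt(k_B * T / m)   # std. deviation of each velocity component

    rng = np.random.default_rng(1)
    v = rng.normal(0.0, sigma, size=(100_000, 3))   # vx, vy, vz per molecule
    speeds = np.linalg.norm(v, axis=1)

    print(f"std dev of one component: {v[:, 0].std():.0f} m/s (theory {sigma:.0f})")
    print(f"mean molecular speed: {speeds.mean():.0f} m/s "
          f"(theory {np.sqrt(8 * k_B * T / (np.pi * m)):.0f})")

The velocity components come out around 400 meters per second either way: hot, excited, and rocketing indeed.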
If this were all, we could still say that probability is just a way of talking about heat, not something intrinsic to reality. But Maxwell discovered a further imp in the boiler. Maxwell's clever demon was called into being in 1871 to point out an essential difference between what could happen in physical systems and what actually does. Imagine the demon as doorkeeper, guarding the pipe linking two boilers. As Maxwell had shown, the various steam particles have different energies, distributed around the constant mean energy for the whole system. So when a particular molecule approaches the pipe, the demon sizes up its energy: if it is above a certain threshold, he lets the molecule through; otherwise, he remains, arms folded, staring off into space. You can see that given enough time, this selection procedure would produce a marked difference in energy between the two boilers: all the high-energy VIP molecules enjoying a party there beyond the pipe while the low-energy majority lurk resentfully on this side. The energy for the whole system remains the same; but it is now more organized than it was, so much so that you could use the difference in energy to do useful work. This sorting seems to create something out of nothing, making possible a thermodynamic perpetual motion machine.
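A toy simulation makes the demon's trick concrete. This is my own construction, not Maxwell's: molecular energies are drawn from an exponential distribution for simplicity, and the demon works a one-way door with a fixed threshold. The sorting appears within seconds; the point, of course, is that no unaided physical process behaves this way.

    import numpy as np

    rng = np.random.default_rng(2)
    mean_E = 1.0
    box_a = list(rng.exponential(mean_E, 5000))   # energies, arbitrary units
    box_b = list(rng.exponential(mean_E, 5000))
    threshold = 1.5 * mean_E                      # the demon's standard for admission

    for _ in range(100_000):          # molecules approach the pipe at random
        if not box_a:
            break
        i = int(rng.integers(len(box_a)))
        if box_a[i] > threshold:      # energetic enough: the door opens;
            box_b.append(box_a.pop(i))  # otherwise, arms stay folded

    print(f"mean energy, this side: {np.mean(box_a):.2f}")
    print(f"mean energy, VIP side:  {np.mean(box_b):.2f}")
    print(f"total energy unchanged: {sum(box_a) + sum(box_b):.0f}")

The totals never change; only the organization does, and that organized difference is exactly what could be harnessed to do work.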
Yet as Maxwell took pains to point out, such a sorting never takes place in the physical world. Mix hot and cold, you get lukewarm, which will not then separate again. Heat moves to cold, high pressure to low, energy spreads; you don't see these qualities concentrating themselves. This is the Second Law of Thermodynamics: entropy (that is, the proportion of energy not available to do work) tends to its maximum in any closed system. Things fall apart; the center cannot hold. Physical systems slump into the most comfortable position possible: that is, where energy gradients approach flatness.
“I offer you something quite modest, admittedly for me all that I have: myself, my entire way of thinking and feeling.” Ludwig Boltzmann's own energy gradient was always sharply up or down: he was never entirely comfortable, although he looked the archetype of a nineteenth-century Viennese professor, chubby, flowing-bearded, with weak eyes peering behind oval spectacles. He was kind and argumentative, inspired and despairing, daring and doubtful. He blamed his uneasy moods on having been born on the cusp between Mardi Gras and Ash Wednesday.
Boltzmann's conscience would not allow him to ignore the logical gap between a gas as a collection of individual molecules, colliding according to classical physical rules, and a gas as a collective described in terms of statistical properties. He set about bridging it in 1872, with his “transport equation,” a description of how a chain of collisions would, over time, distribute momentum from particle to particle so that the final result was normally distributed, operating, in effect, like a complex three-dimensional version of Galton's quincunx.
Imagine it this way: every collision between a higher-velocity molecule and a lower one tends to transfer some energy from the one to the other. When my car rear-ends yours, you jerk forward and I slow down. We could start with a system composed half of high-velocity particles and half of near-stationary ones; it has low entropy in that it's very ordered. As time passes and these particles collide, the proportion of molecules with either high or zero velocity goes down and the proportion of those with something between high and zero velocity rises. Energy will continue to pass across at every collision, like genes at every generation, but the population as a whole maintains a constant, normal distribution. So although we cannot talk about the history and velocity of each particle, we can talk about the proportion of particles that have energies within a set of given ranges; the vast tangle of interconnected functions of motion is organized into a flight of discrete steps, just as the rich complexity of a human population can be organized by measuring chests or taking a poll.
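A crude stand-in for this process (my sketch, far simpler than Boltzmann's actual transport equation) pairs equal-mass molecules at random in two dimensions and rotates each pair's relative velocity by a random angle, which conserves momentum and kinetic energy exactly. The ordered half-fast, half-stationary population dissolves into the equilibrium distribution.

    import numpy as np

    rng = np.random.default_rng(3)
    n = 20_000
    v = np.zeros((n, 2))                      # 2-D velocities, arbitrary units
    theta = rng.uniform(0, 2 * np.pi, n // 2)
    v[: n // 2, 0] = np.cos(theta)            # half the molecules at speed 1,
    v[: n // 2, 1] = np.sin(theta)            # half stationary: an ordered start

    def census(v):
        s = np.linalg.norm(v, axis=1)         # fraction fast, fraction near-zero
        return (np.abs(s - 1.0) < 0.05).mean(), (s < 0.1).mean()

    print("before: fast %.2f, near-zero %.2f" % census(v))

    for _ in range(50):                       # sweeps of random pair collisions
        idx = rng.permutation(n)
        a, b = v[idx[: n // 2]], v[idx[n // 2:]]
        cm, rel = (a + b) / 2, (a - b) / 2    # center of mass, relative velocity
        phi = rng.uniform(0, 2 * np.pi, (n // 2, 1))
        rel = np.hstack([rel[:, :1] * np.cos(phi) - rel[:, 1:] * np.sin(phi),
                         rel[:, :1] * np.sin(phi) + rel[:, 1:] * np.cos(phi)])
        v[idx[: n // 2]], v[idx[n // 2:]] = cm + rel, cm - rel

    print("after:  fast %.2f, near-zero %.2f" % census(v))
    print("energy per molecule: %.3f" % ((v ** 2).sum() / (2 * n)))  # still 0.250

Both extreme proportions collapse (from half each to a few percent) while the middle fills in, and the energy per molecule never budges: redistribution without loss, exactly as described above.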
In 1877, Boltzmann extended this idea to explain mechanically why entropy tends toward its maximum value in an isolated system, a feat he achieved by relating the overall state of the system to the number of all its possible micro-states.
Consider a system with a given total energy. There are many different ways that energy could be distributed: equally among all particles, for example, or with all the energy vested in one hyperactive molecule while all the rest remained in chilly immobility. We can call each of these possibilities a micro-state. Each is equally probable, in the same sense that each of the 36 throws is equally probable with a pair of dice. But, you'll remember, the totals you get from the throws are not equally probable: there are more ways of making 7, for instance, than of making 12. Let a vast room full of craps players throw dice simultaneously, and you will find a symmetrical distribution of totals around seven, for the same reason that you find a normal distribution of velocities in a gas at equilibrium: because there are proportionally more ways to achieve this distribution than, say, all boxcars on this side of the room and all snake eyes on that. The maximum entropy for a physical system is the macro-state represented by the highest proportion of its possible micro-states. It is, in the strictest sense, “what usually happens.”
Boltzmann's linking of microscopic and macroscopic showed how the countless little accidents of existence tend to a general loss of order and distinction. Things broken are not reassembled; chances lost do not return. You can't have your life to live over again, for the same reason you can't unstir your coffee.
But, objected Boltzmann's contemporaries, you can unstir your coffee, at least in theory. Every interaction in classical physics is reversible: if you run the movie backward, all the rules still apply. Every billiard-ball collision “works” just as correctly in reverse as it does forward. True, we live in a mostly dark, cold and empty universe, so we don't see, for instance, light concentrating from space onto a star, as opposed to radiating out from it. Yet if we were to see this, it would merely be surprising, not impossible. Our sense of the direction of time, our belief that every process moves irreversibly from past to future, has no clearly defined basis in the mechanics of our cosmos.
So how could Boltzmann suggest that, although time has no inherent direction at the microscopic scale, it acquires direction when one adds up all the micro-states? How could a grimy steam boiler hold a truth invisible in the heavens? The objections were formal and mathematically phrased, but you can hear in them the same outrage that warmed the proponents of Free Will when they argued against Quetelet's statistical constants.
Yet there was more than moral outrage at work: there was genuine puzzlement. Poincaré's conclusions from studying the three-body problem had included a proof that any closed physical system, given enough time, will return arbitrarily close to any of its previous states. This is not quite the Eternal Return with which Nietzsche used to frighten his readers (the hopelessness to which the Hero must say “yes”), since only the position, not the path, is repeated: this moment (or something very like it) will recur, but without this moment's past or future. Even so, Poincaré's proof seems to contradict the idea of ever-increasing entropy, because it says that someday, if you care to wait, the system will return to its low-entropy state: the cream will eventually swirl out from the coffee.
Boltzmann, surprisingly, agreed. Yes, he said, low entropy can arise from high, but low entropy is the same as low probability. We can imagine the state of our system moving through the space representing all its possible states as being like an immortal, active fly trapped in a closed room. Almost every point in the room is consistent with the maximum entropy allowed; just one or two spots in distant corners represent the system in lower entropy. In time, the fly will visit every place in the room as many times as you choose, but most points will look (in terms of their entropy) the same. The times between visits to any one, more interesting point will be enormous. Boltzmann calculated that the molecules of a gas in a sphere of radius 0.00001 centimeter would return to any given configuration about once in 3 × 10⁵⁷ years, some 200,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 times the age of the universe so far. As comparatively vast a system as a cup of coffee would be more than cold before it spontaneously separated.
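The comparison is easy to verify, taking the modern figure of roughly 1.4 × 10¹⁰ years for the age of the universe (my assumption; Boltzmann's own estimate would have been smaller, making the ratio even larger):

    recurrence_years = 3e57        # Boltzmann's figure for the tiny sphere of gas
    universe_age_years = 1.4e10    # approximate modern estimate
    print(f"{recurrence_years / universe_age_years:.0e} ages of the universe")
    # -> 2e+47: the 2-followed-by-47-zeros figure quoted above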
Time's arrow, then, is not an intrinsic fact of nature; it is something defined by the prevalence of the more probable over the less probable. It is part of what usually happens but need not. Nothing in physics requires that we live from past to future; it's just a statistical likelihood. Somewhere in the universe now, physics may indeed be behaving like the movies shown backward at the end of children's parties: water leaps back into buckets and cream pies peel from matrons' faces to land back on the baker's cart. But it's highly improbable. “Time and chance happeneth to them all,” says Ecclesiastes, because time is chance.
Boltzmann's discoveries created the modern field of statistical mechanics, the general theory of which thermodynamics is the special case. It studies, as the quiet, brilliant Yale bachelor Josiah Willard Gibbs put it, how “the whole number of systems will be distributed among the various conceivable configurations and velocities at any required time, when the distribution has been given for one time.”