Although of low mass, axions can account for dark matter provided there are enough of them. Once again, calculations seemed to suggest that, if they exist, axions would have been produced in abundance in the early universe and would be prevalent today.
The Axion Dark Matter Experiment (ADMX), at the University of Washington's Center for Experimental Nuclear Physics and Astrophysics, is looking for evidence of dark matter axions interacting with the strong magnetic field generated by a superconducting magnet. In such interactions, the axions are predicted to decay into microwave photons, which can be detected.
Experiments to date have served to exclude one kind of strongly interacting axion in the mass range 1.9–3.5 millionths of an electron volt. In the next ten years the collaboration hopes either to find or exclude a weakly interacting axion in the mass range 2–20 millionths of an electron volt.
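To see why the search takes place at microwave frequencies, it helps to convert these tiny masses into the frequencies of the photons the axions would turn into, using E = hf. The sketch below is purely illustrative and simply uses the mass values quoted above; it lands in the region of roughly 0.5 to 5 gigahertz, squarely in the microwave band.

```python
# Convert an axion rest-mass energy (in millionths of an electron volt) into the
# frequency of the microwave photon it would convert to, using E = h * f.
H_PLANCK = 6.626e-34   # Planck constant, J s
EV_TO_J = 1.602e-19    # one electron volt in joules

def photon_frequency_ghz(mass_micro_ev):
    """Photon frequency (GHz) corresponding to an axion of the given mass."""
    energy_joules = mass_micro_ev * 1e-6 * EV_TO_J
    return energy_joules / H_PLANCK / 1e9

# The mass values quoted in the text, in millionths of an electron volt.
for mass in (1.9, 3.5, 2.0, 20.0):
    print(f"{mass:5.1f} micro-eV  ->  {photon_frequency_ghz(mass):.2f} GHz")
```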
The last dark matter candidate we will consider here is the primordial black hole. Most readers will already know that a black hole is formed when a large star collapses. Its mass becomes so concentrated that not even light can escape the pull of its gravity. Astronomers have inferred the existence of two types of black hole: those with a mass around ten times the mass of the sun (10 M☉), and super-massive black holes with masses between a million and ten billion M☉ that reside at the centres of galaxies.
Primordial black holes are not formed this way. It is thought that they might have been created in the early moments of the big bang, when wild fluctuations in the density of matter might have tripped over the threshold for black hole formation. Unlike black holes formed by the collapse of stars, primordial black holes would be small, with masses similar to those of asteroids, or about a ten billionth of M☉. To all intents and purposes, they would behave like massive particles.
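To put 'a ten billionth of M☉' into more familiar units, the rough conversion below (assuming a solar mass of about 2 × 10³⁰ kilograms) also works out the corresponding Schwarzschild radius, r_s = 2GM/c². An asteroid-scale mass turns out to be packed into something smaller than a bacterium, which is why such an object would behave, to all intents and purposes, like a massive particle.

```python
# Rough scale of a primordial black hole of about a ten-billionth of a solar mass.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg (assumed value)

m_pbh = 1e-10 * M_SUN                 # ~2e20 kg, broadly asteroid-like
r_s = 2 * G * m_pbh / C**2            # Schwarzschild radius, metres

print(f"mass                 ~ {m_pbh:.1e} kg")
print(f"Schwarzschild radius ~ {r_s * 1e6:.2f} micrometres")   # ~0.3 micrometres
```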
Options for detecting them are limited, however. In 1974, Stephen Hawking published a paper suggesting that, contrary to prevailing opinion, large black holes might actually emit radiation as a result of quantum fluctuations at the black hole's event horizon, the point of no return beyond which nothing, whether matter or light, can escape. This came to be known as Hawking radiation. Its emission causes the black hole to lose mass and eventually evaporate in a small explosion (at least, small by astronomical standards).
If primordial black holes were formed in the early universe, and if they emit Hawking radiation, then we may be able to identify them through telltale explosions in the dark matter halo surrounding our own Milky Way galaxy. One of the many tasks of the Fermi Gamma-ray Space Telescope, launched by NASA on 11 June 2008, is to look for exploding primordial black holes based on their expected 'signature' bursts of gamma rays.
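As a rough indication of which primordial black holes Fermi could hope to catch exploding, the standard order-of-magnitude estimate for the Hawking evaporation time, t ≈ 5120πG²M³/(ħc⁴), can be inverted to find the mass whose lifetime matches the present age of the universe. The sketch below does just that; note that this mass is far below the asteroid-like masses mentioned above, which would survive for unimaginably longer.

```python
import math

# Order-of-magnitude Hawking evaporation time: t ~ 5120 * pi * G^2 * M^3 / (hbar * c^4).
G = 6.674e-11                     # m^3 kg^-1 s^-2
C = 2.998e8                       # m/s
HBAR = 1.055e-34                  # J s
AGE_UNIVERSE = 13.8e9 * 3.156e7   # ~13.8 billion years, in seconds

def evaporation_time_s(mass_kg):
    return 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)

# Invert the cubic dependence: the mass whose lifetime equals the age of the universe.
m_exploding_now = (AGE_UNIVERSE * HBAR * C**4 / (5120 * math.pi * G**2)) ** (1 / 3)

print(f"mass finishing evaporation now ~ {m_exploding_now:.1e} kg")          # ~2e11 kg
print(f"lifetime of a 2e20 kg hole     ~ {evaporation_time_s(2e20):.1e} s")  # vastly longer than the age of the universe
```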
So, lots of ideas and lots of searching. But no evidence one way or another, yet.
The catastrophe of the vacuum
We saw in the last chapter that a series of astronomical observations is now lined up behind the existence of dark energy, manifested in the ΛCDM model of big bang cosmology as a cosmological constant contributing an energy density equivalent to an Ω of 0.73.
At first, this seems quite baffling. It requires 'empty' spacetime to possess energy (the eponymous 'dark' energy) which acts like a kind of antigravity, pushing space apart and accelerating the expansion of the universe.
But wait. Haven't we already come across something like this before? The operation of Heisenberg's uncertainty principle on the vacuum of 'empty' space means that it is, in fact, filled with quantum fluctuations. And the existence of these fluctuations is demonstrated by the experiment which measured the Casimir force between two parallel plates.
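For idealized, perfectly conducting parallel plates, the Casimir effect has a famously compact prediction: an attractive pressure of π²ħc/(240d⁴), where d is the plate separation. The sketch below, with a separation of one micrometre chosen purely for illustration, gives a pressure of about a millipascal: tiny, but measurable, and entirely a consequence of the fluctuating vacuum.

```python
import math

# Casimir pressure between ideal parallel conducting plates:
# P = pi^2 * hbar * c / (240 * d^4), attractive, for plate separation d.
HBAR = 1.055e-34   # reduced Planck constant, J s
C = 2.998e8        # speed of light, m/s

def casimir_pressure_pa(separation_m):
    """Attractive Casimir pressure (pascals) for the given plate separation (metres)."""
    return math.pi**2 * HBAR * C / (240 * separation_m**4)

d = 1e-6   # one micrometre, a typical experimental separation
print(f"Casimir pressure at {d * 1e6:.0f} micrometre: {casimir_pressure_pa(d) * 1e3:.2f} mPa")
```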
The existence of vacuum fluctuations means that there is what physicists call a non-zero vacuum expectation value (a kind of fancy average value) for the energy of the vacuum. This sounds perfect. Surely the cosmological constant reflects the basic quantum uncertainty of spacetime. Wouldn't that be poetic?
Not so fast.
The size of the cosmological constant required by the ΛCDM model suggests a density of vacuum energy of the order of a millionth of a billionth (10⁻¹⁵) of a joule per cubic centimetre. Now, you may not be entirely familiar with the joule as a unit of energy, so let's try to put this figure into some sort of perspective.
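Before we do, it is worth seeing where the number comes from. The vacuum energy density is just the dark energy fraction of 0.73 multiplied by the critical density of the universe, 3H₀²/8πG, converted to an energy density using c². A minimal check, assuming a Hubble constant of about 70 kilometres per second per megaparsec, lands within a factor of a few of 10⁻¹⁵ joules per cubic centimetre.

```python
import math

# Dark energy density implied by the LambdaCDM figures quoted in the text,
# assuming a Hubble constant of ~70 km/s per megaparsec.
G = 6.674e-11              # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8                # speed of light, m/s
MPC_IN_M = 3.086e22        # one megaparsec, in metres
H0 = 70e3 / MPC_IN_M       # Hubble constant, converted to s^-1
OMEGA_DARK_ENERGY = 0.73   # the dark energy contribution from the text

rho_crit = 3 * H0**2 / (8 * math.pi * G)              # critical mass density, kg/m^3
energy_density = OMEGA_DARK_ENERGY * rho_crit * C**2  # J/m^3

print(f"dark energy density ~ {energy_density * 1e-6:.1e} J/cm^3")   # ~6e-16, i.e. of order 1e-15
```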
Kellogg's cornflakes contain about 1.6 million joules of energy per 100 grams.* A typical box contains 450 grams and has dimensions 40 × 30 × 5 centimetres. In an upright box the cornflakes tend to shake down and occupy only a proportion, let's say 75 per cent, of the volume inside. From this information we can estimate a chemical energy density of a box of cornflakes of about 1,600 joules per cubic centimetre. This gives us a sense of the scale of energy densities in 'everyday' life.
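The arithmetic behind that 1,600 joules per cubic centimetre is nothing more than the figures above strung together, as the short sketch below shows (the 75 per cent fill factor is the same rough guess made in the text).

```python
# Chemical energy density of a box of cornflakes, from the figures in the text.
energy_per_100_g = 1.6e6        # joules per 100 grams
box_mass_g = 450                # grams in a typical box
box_volume_cm3 = 40 * 30 * 5    # box dimensions in centimetres -> 6,000 cm^3
fill_factor = 0.75              # cornflakes settle to ~75% of the box volume

total_energy_j = energy_per_100_g * box_mass_g / 100   # ~7.2 million joules
occupied_volume_cm3 = box_volume_cm3 * fill_factor     # ~4,500 cm^3

print(f"chemical energy density ~ {total_energy_j / occupied_volume_cm3:.0f} J/cm^3")   # ~1,600
```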
Although we don't normally include it in our daily nutritional considerations, cornflakes also contain energy in the form of mass. We can use Einstein's formula E = mc² to calculate that a 450 gram box is equivalent to about 40 million billion joules. This gives a mass-energy density of about 7 trillion (7 × 10¹²) joules per cubic centimetre. There's not much to be gained by adding the chemical energy density to this figure to give a total.
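The mass-energy figure works the same way, via E = mc². The sketch below uses the full box volume; whether or not you apply the 75 per cent fill factor makes no difference at this level of rounding.

```python
# Mass-energy of the cornflakes box via E = m * c^2, and the resulting energy density.
C = 2.998e8                     # speed of light, m/s
box_mass_kg = 0.45              # a 450 gram box
box_volume_cm3 = 40 * 30 * 5    # ~6,000 cubic centimetres

mass_energy_j = box_mass_kg * C**2    # ~4e16 J, i.e. about 40 million billion joules

print(f"mass-energy         ~ {mass_energy_j:.1e} J")
print(f"mass-energy density ~ {mass_energy_j / box_volume_cm3:.1e} J/cm^3")   # ~7e12, i.e. ~7 trillion
```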
So, perhaps not altogether surprisingly, the energy density of the vacuum is about 10⁻²⁷ times the energy density of an everyday object like a box of cornflakes. Hey, it might not be completely empty, but it's still a 'vacuum', after all.
What does quantum theory predict? Well, the calculation is a little problematic. On the one hand, the uncertainty principle appears to impose some fairly rigid restrictions on what can't happen in the quantum world. On the other hand, it is extraordinarily liberal regarding what can happen. And history has shown that when quantum theory says that something can happen in principle, then this something generally tends to happen in practice.
The trouble with the uncertainty principle is that it doesn't care about the size (in energy terms) of quantum fluctuations in the vacuum provided they happen on timescales consistent with the principle. It simply demands a trade-off between energy and time. So, a fluctuation of near-infinite energy is perfectly acceptable provided it happens in an infinitesimally short time.
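Written out, the trade-off says that the product of a fluctuation's energy ΔE and its lifetime Δt can be no smaller than about ħ/2: the shorter the fluctuation lives, the more energy it is permitted to borrow. The sketch below simply tabulates that inverse relationship for a few arbitrary timescales, the last of which is roughly of the order of the Planck time.

```python
# Energy-time uncertainty: Delta_E * Delta_t >~ hbar / 2.
# The shorter a fluctuation lives, the more energy it may 'borrow'.
HBAR = 1.055e-34   # reduced Planck constant, J s
EV = 1.602e-19     # joules per electron volt

for dt in (1e-15, 1e-21, 1e-27, 1e-35, 1e-44):   # fluctuation lifetimes, seconds
    dE = HBAR / (2 * dt)                          # minimum borrowable energy scale, joules
    print(f"dt = {dt:.0e} s  ->  dE ~ {dE / EV:.1e} eV")
```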
I think you can probably guess where this leads. Accumulating all the quantum fluctuations that are 'allowed' by the uncertainty principle results in an infinite vacuum energy density.
This is vaguely reminiscent of the problem we encountered in quantum electrodynamics, in which an electron interacts with its own self-generated electromagnetic field, resulting in an infinite contribution to the electron mass. That problem was resolved using the technique of renormalization, effectively subtracting infinity from infinity to give a finite, renormalized result. Unfortunately, renormalizing the vacuum is a non-trivial problem.
Theorists have made some headway by 'regularizing' the calculation. In essence, this involves applying an arbitrary cut-off, simply deleting from the equations all the terms relating to the highest-energy fluctuations. These occur with dimensions and within timescales where in any case the theorists are no longer confident about the validity of quantum theory. This is the Planck scale, the domain of gravity.
Now it's certainly true that regularizing the calculation in this way does improve things. The vacuum energy density is no longer predicted to be infinite (yay!). Instead, it's predicted to have a value of the order of 100,000 googol* (10¹⁰⁵) joules per cubic centimetre. In case you've forgotten already, the 'observed' value is 10⁻¹⁵ joules per cubic centimetre, so the theoretical prediction is out by a staggering hundred billion billion googol (10¹²⁰).
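A crude way to see where a number of this magnitude comes from is to pack roughly one Planck energy into every cube one Planck length on a side. The sketch below does exactly that; the precise power of ten it yields differs from the figure quoted above by the sort of numerical factors that vary from one treatment to another, so treat it as an order-of-magnitude illustration only. Either way, the mismatch with the observed value is somewhere in the region of 120 powers of ten.

```python
import math

# Crude Planck-scale estimate of the vacuum energy density:
# roughly one Planck energy per Planck volume, ignoring all numerical factors.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
HBAR = 1.055e-34   # reduced Planck constant, J s

E_PLANCK = math.sqrt(HBAR * C**5 / G)    # Planck energy, ~2e9 J
L_PLANCK = math.sqrt(HBAR * G / C**3)    # Planck length, ~1.6e-35 m

density_j_per_cm3 = E_PLANCK / L_PLANCK**3 * 1e-6   # convert J/m^3 to J/cm^3

print(f"Planck-scale vacuum energy density ~ {density_j_per_cm3:.1e} J/cm^3")
print(f"mismatch with the observed 1e-15   ~ 1e{round(math.log10(density_j_per_cm3 / 1e-15))}")
```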
That's got to be the worst theoretical prediction in the history of science.
It's perhaps rather amusing to note that after all the wrangling over whether or not there is a cosmological constant, quantum theory is actually quite clear and unambiguous on this question. There definitely should be a cosmological constant: quantum theory demands it. But now we need to understand just how it can be so small.
The long and winding road to quantum gravity
Within a few short months of his final lecture on general relativity to the Prussian Academy of Sciences, Einstein was back at the Academy explaining that his new theory might need to be modified:
Due to electron motion inside the atom, the latter should radiate gravitational, as well as electromagnetic energy, if only a negligible amount. Since nothing like this should happen in nature, the quantum theory should, it seems, modify not only Maxwell's electrodynamics but also the new theory of gravitation.¹⁰
In other words, Einstein was hinting that there should be a quantum theory of gravity.
There are a number of different ways physicists can try to construct such a theory. They can try to impose quantum rules on general relativity in a process called 'canonical quantization'. This is generally known as the canonical approach to quantum gravity. Einstein himself tended to dismiss this approach as 'childish'.
Alternatively, they can start with relativistic quantum field theory and try to make it generally covariant, meaning that the physical (quantum) laws described by the theory are independent of any arbitrary change in co-ordinate system, as demanded by general relativity. This is known as the covariant approach to quantum gravity. In a quantum field theory, gravity is described in much the same way as forces in the standard model of particle physics, in terms of the exchange of a force carrier, called the graviton, between gravitating objects.
A third approach is to start over.
Whichever approach we take, we run into a series of profound problems right at the outset. General relativity is about the motions of large-scale bodies such as planets, stars, solar systems, galaxies and the entire universe within a four-dimensional spacetime. In general relativity these motions are described by Einstein's gravitational field equations. These are complex equations because the mass they describe distorts the geometry of the spacetime around it, and that same geometry in turn governs how the mass moves.
But spacetime itself is contained entirely within the structure of general relativity; it is a fundamental variable of the theory. The theory itself constructs the framework within which mass moves and things happen. In this sense the theory is 'background independent': it does not presuppose the existence of a background framework to which the motions of large masses are to be referred.
Quantum theory, in contrast, presumes precisely this. It is 'background dependent', requiring an almost classical Newtonian container of space and time within which the wavefunctions of quantum particles can evolve.