These barbarous events reinforced fears that mankind, in its anxiety to acquire higher living standards by exploiting the earth’s natural resources, was damaging the planet irreparably. The ecological fears of the 1980s and early 1990s were in some ways similar to the panics of the 1970s, in which the world was warned it was running out of key raw materials; that is, they were both marked by emotionalism masquerading as science, gross exaggeration and reckless (even dishonest) use of statistics. Nonetheless some of the much-publicized worries had substance. There was, for instance, justified concern at the rapid destruction of the tropical rainforests, especially in Brazil, for commercial purposes. The rainforest area was calculated at about 1.6 billion hectares before deforestation by humans began in earnest in the nineteenth century. By 1987 it had been reduced to 1.1 billion, and about 80,000 square kilometres, an area the size of Austria, was being lost every year. The result was erosion of soil, floods, drought and appreciable effects on the world’s atmosphere. An additional point, beloved of ecologists but perhaps of less concern to most people, was the loss of insect species caused by deforestation. Some 30 million species of insects lived in the rainforests during the 1980s; they were being destroyed at the rate of six an hour, and 10–30 per cent of the earth’s species, it was reckoned, would be extinct by the end of the century.[144]
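(Readers who wish to test the figures quoted above can do so with simple arithmetic; the short Python sketch below hard-codes the source’s own numbers, which are assumptions rather than independent data, and projects how long the remaining forest would last at the stated rate of loss.)

```python
# Back-of-envelope check of the deforestation figures cited above.
# All inputs are the source's own numbers, not independent data.

KM2_PER_HECTARE = 0.01          # 1 hectare = 0.01 square kilometres

original_ha = 1.6e9             # rainforest area before large-scale clearing
remaining_ha_1987 = 1.1e9       # area said to remain by 1987
annual_loss_km2 = 80_000        # cited yearly loss, "an area the size of Austria"

annual_loss_ha = annual_loss_km2 / KM2_PER_HECTARE   # = 8 million hectares/year
years_to_exhaustion = remaining_ha_1987 / annual_loss_ha

print(f"Annual loss: {annual_loss_ha / 1e6:.0f} million hectares")
print(f"At that rate the 1987 forest would vanish in about {years_to_exhaustion:.0f} years")
# -> Annual loss: 8 million hectares
# -> At that rate the 1987 forest would vanish in about 138 years
```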

Tropical deforestation was linked to a problem which, in the later 1980s, came increasingly to be seen as serious not only by ecological pressure groups but by science and government: the ‘greenhouse effect’. Two distinct atmospheric worries were in fact involved, though they were often run together. The burning of fossil fuels produced carbon dioxide, which acted like the glass of a greenhouse, trapping the sun’s heat; and the growing use of chlorofluorocarbons, employed for instance as propellants in aerosols and in refrigeration and air-conditioning, was progressively weakening, it was argued, the earth’s ozone layer, which keeps out harmful ultra-violet radiation from the sun. Sweden had passed a law banning aerosol sprays as long ago as January 1978; but then Sweden passed many laws banning supposedly harmful human activities. The first serious and documented warning about the ozone layer came in March 1984 from a team at the University of East Anglia. The ‘greenhouse effect’ was calculated to produce warmer summers, milder winters, but also violent storms, floods and drought. British people in particular began to believe there was some truth in it during the 1980s, which produced some of the warmest summers on record and, on 16 October 1987, the most violent storm since the early eighteenth century, which destroyed millions of trees, including many prize specimens at Kew Gardens. The month before, seventy nations, meeting in Montreal, had agreed (16 September) on a programme of measures to freeze chlorofluorocarbon emissions at existing levels and reduce them by 50 per cent before 1999. By the early 1990s the world was slowly waking up to its responsibilities as a preserver of the planet, as well as its exploiter.[145]

Yet if the industrial use of technology, such as the immense machines which were tearing down the Brazilian rainforests, could damage the earth, technological advances, including sophisticated monitoring systems, could help to preserve it, by telling us exactly what was happening and what we were doing wrong. In any case, there was no halting the march of science and its application, which proceeded at an ever-accelerating pace throughout the twentieth century, both assisting man in his barbarism and reducing its worst consequences. The winning of the Gulf War by high-technology weapons, thus reducing casualties (at least on the Allied side) to a minimum, was both an exemplar and a pointer to the future. In purely physical terms, the exact sciences fulfilled all their promises in the twentieth century. Modern times, in earlier phases, were dominated by physics, especially nuclear physics and astrophysics. The physicist carried man to the brink of the pit but then halted him and bade him look down. It may be that, after the seeming inevitability of two world wars, the creation of nuclear weapons was an admonitory gift, which spared us a third clash of great nations and introduced what had become, by the early 1990s, the longest period of general peace ever recorded. The end of the Cold War, too, and the partial reconciliation of the two leading thermonuclear powers, suggested that they would be prepared to take joint steps to prevent the spread of such weapons to states foolish enough to use them. In this sense physics seems to have served an important political purpose in the second half of the century.

But physics seemed to have come to the end of its paramountcy during the 1960s. In any case, it could not tell people what they increasingly demanded to know: what had gone wrong with humanity. Why had the promise of the nineteenth century been dashed? Why had much of the twentieth century turned into an age of horror or, as some would say, evil? The social sciences, which claimed such questions as their province, could not provide the answer. Nor was this surprising: they were part, and a very important part, of the problem. Economics, sociology, psychology and other inexact sciences – scarcely sciences at all in the light of modern experience – had constructed the juggernaut of social engineering, which had crushed beneath it so many lives and so much wealth. The tragedy was that the social sciences only began to fall into disfavour in the 1970s, after they had benefited from the great afflatus of higher education. The effect of the social science fallacy would therefore still be felt until the turn of the century.

Indeed, in the early 1990s, social scientists at Western universities, including some with high, if falling, reputations, were still trying to practise social engineering. At Oxford, and to a lesser extent at Cambridge, for instance, some colleges pursued a policy of discriminating, in their admissions procedures, against high-performing boys and girls from fee-paying schools, in favour of lower-performing applicants from state schools.[146] The object was the purely social and non-academic one of correcting supposed ‘social and financial imbalances’ in the general population. The consequence, however, was simply a lowering of standards. But standards themselves came under attack. One senior academic at the University of Pennsylvania, who opposed the whole idea of a hierarchy of merit in literature and the arts, and who wrote that distinguishing between the work of Virginia Woolf and Pearl Buck was ‘no different from choosing between a hoagy and a pizza’, declared publicly that he was ‘one whose career is dedicated to the day when we have a disappearance of those standards’. The fact that he was elected to be the 1992 president of the Modern Language Association of America demonstrated the power of deconstructionists, as they were called, in academia.[147] But if, as deconstructionists maintained, ‘hierarchical’ systems of judgement, which favoured the study of Shakespeare’s plays over, say, comic books, were a source of social evil, what was the point of universities, whose traditional purpose was the pursuit of excellence?

Some universities now argued that the function of the campus was to correct social abuses. At Harvard, Yale, Stanford and elsewhere, social engineering operated in a variety of ways. While it was difficult to expel students for organizing violent demonstrations on behalf of approved causes, or indeed for doing no academic work at all, it was comparatively easy to extrude them summarily for offending against the code of liberal censorship by using words condemned by organized pressure groups. At Smith, once one of the best women’s colleges in the world, forbidden activities included not merely racism, sexism, ‘ageism’, heterosexism and other narrowly-defined antisocial evils, but ‘lookism’, said to ‘oppress’ ugly people by ‘supposing a standard for beauty and attractiveness’. A visiting professor at Harvard Law School, once the best law school in the world, committed the particularly heinous crime of ‘sexism’ by quoting Byron’s famous line, ‘And whispering I will ne’er consent – consented’. In 1991 Stanford was reported to be working on a ‘speech code’, in which such words as ‘girls’ and ‘ladies’ were forbidden as ‘sexist’; instead of ‘girl’, the term ‘pre-woman’ had to be employed, though on this point there was some disagreement, since some female pressure groups insisted the word ‘woman’ should be spelt ‘womyn’, and others ‘wimman’.[148]
Significantly, just as in Marxist states social engineering went hand in hand with financial corruption of the most blatant kind, the same conjunction appeared in ‘progressive’ American universities. Early in 1991, the House of Representatives’ Energy and Commerce Committee, under the chairmanship of John Dingell, began a vigorous investigation into the use of the $9.2 billion a year provided to American universities by the federal government in the form of research contracts. They discovered that at Stanford, which had received $1.8 billion during the previous ten years, about $200 million had been syphoned off into unjustifiable expenditure, designed chiefly to give the academic staff, from the university’s president downwards, a higher standard of living.[149] Such scandals contributed to the process which, by the early 1990s, had begun to undermine the standing of the universities in general, and the social sciences in particular, among the public.

But if physics seemed to have entered, by comparison with its triumphs in the first half of the century, a period of relative quiescence, and if the social sciences were discredited, a new era of biology began from the 1950s onwards. Hitherto, the exact sciences had been able to tell us far too little about life, as opposed to matter. By the 1950s, the way in which the non-organic world operated was generally known; what began to mature in the next thirty years was knowledge of the laws of life. Such law-systems proved to be unitary and holistic. Just as Einstein’s recasting of the laws of physics applied both in the ordering of gigantic stellar congregations and in the minute structures of subatomic particles, so the evolving biological rules applied over the whole spectrum of living matter, from the smallest to the greatest.

In the mid-nineteenth century, Charles Darwin’s theory of evolution for the first time provided a scientific organizing principle to explain why plants and animals developed the characteristics they exhibit. It was not a deductive system, permitting the prediction of future developments or even the reconstruction of the past: in this sense it was unlike Newton’s laws or Einstein’s modifications of them. Darwin himself always stressed the limits of his discoveries. He discouraged those who sought to build ambitious projections on them. That was why he gave no licence to the theories of the ‘social Darwinists’, which terminated in Hitler’s Holocaust, and why he likewise brushed off Marx’s attempts to appropriate Darwinism for his own theories of social determinism, which eventually produced the mass murders of Stalin, Mao Tse-tung and Pol Pot. In the second half of the twentieth century, however, there were at last signs of a unified theory emerging from the laboratory and reaching to both ends of the spectrum.

At the microcosmic end, molecular biology, neurophysiology, endocrinology and other new disciplines began to explain such processes as the mechanism of genetic inheritance and programming. The most important of the micro-level discoveries came at Cambridge University in 1953 when James Watson and Francis Crick succeeded in deciphering the double-helix configuration of the molecule of deoxyribonucleic acid (DNA).[150] They found that molecules of DNA, which determine the structure and function of every living animal or plant, were in the shape of a double coil, like a spiral ladder, built up of sugars and phosphates and formed into rungs containing various acids. The structure, like a magnificently complex, living computer, constitutes the particular code telling the cell what protein to make, the heart of the creative operation.[151]
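(The ‘code’ described here can be caricatured in a few lines of programming. The minimal Python sketch below uses the genuine Watson-Crick pairing rules and a small excerpt of the real codon table; the example gene sequence itself is invented for illustration.)

```python
# Toy illustration of the DNA "code": complementary base pairing
# (the two strands of the double helix) and codon-to-amino-acid
# translation. Pairing rules and codon assignments are the standard
# ones; the example sequence is made up.

PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

# A tiny excerpt of the real genetic code (DNA codons -> amino acids).
CODON_TABLE = {
    "ATG": "Met",  # also the usual "start" signal
    "TTT": "Phe",
    "GGC": "Gly",
    "AAA": "Lys",
    "TAA": "STOP",
}

def complement(strand: str) -> str:
    """Return the base-paired partner strand (read in the same
    direction; biologically the two strands run antiparallel)."""
    return "".join(PAIR[base] for base in strand)

def translate(strand: str) -> list[str]:
    """Read the strand three bases at a time, stopping at a stop codon."""
    protein = []
    for i in range(0, len(strand) - 2, 3):
        amino = CODON_TABLE.get(strand[i:i + 3], "?")
        if amino == "STOP":
            break
        protein.append(amino)
    return protein

gene = "ATGTTTGGCAAATAA"   # invented example sequence
print(complement(gene))    # -> TACAAACCGTTTATT
print(translate(gene))     # -> ['Met', 'Phe', 'Gly', 'Lys']
```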
More striking still was the speed with which this discovery was given a multitude of practical applications. The gap between the theoretical basis of nuclear physics and actual nuclear power was half a century. In the new biology the gap was less than twenty years. In 1972 scientists in California discovered ‘restriction enzymes’, which allowed the DNA to be split in highly specific ways and then recombined or spliced for a particular purpose. The recombinant DNA was put back into its cell or bacterium and, operating according to normal biological principles, divided and reduplicated itself to form new protein material. The man-made micro-organism was then fed with nutrients and fermented by procedures in use by the pharmaceutical industry for half a century in the production of antibiotics.[152]
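(The cutting and splicing just described can likewise be sketched as a toy model. In the Python fragment below, the recognition site GAATTC, that of the real enzyme EcoRI, is genuine; the ‘plasmid’ and ‘insert’ sequences are invented, and actual recombination is of course a biochemical process rather than string manipulation.)

```python
# Toy model of restriction-enzyme cutting and recombination.
# EcoRI really does recognize the site GAATTC and cut after the G;
# the plasmid and insert sequences here are invented examples.

ECORI_SITE = "GAATTC"
CUT_OFFSET = 1   # EcoRI cuts between G and AATTC on this strand

def cut(dna: str) -> list[str]:
    """Split a sequence at every EcoRI recognition site."""
    fragments = []
    start = 0
    pos = dna.find(ECORI_SITE, start)
    while pos != -1:
        fragments.append(dna[start:pos + CUT_OFFSET])
        start = pos + CUT_OFFSET
        pos = dna.find(ECORI_SITE, start)
    fragments.append(dna[start:])
    return fragments

# An invented "plasmid" with one EcoRI site, and an invented insert.
plasmid = "TTGACAGAATTCCTAGGA"
insert = "AATTCATGAAACCCGGGTAAG"   # carries the "gene of interest"

left, right = cut(plasmid)           # open the plasmid at its site
recombinant = left + insert + right  # splice the foreign DNA in

print(cut(plasmid))   # -> ['TTGACAG', 'AATTCCTAGGA']
print(recombinant)    # -> TTGACAG + insert + AATTCCTAGGA
```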

Once DNA had been explored, the formidable resources of modern commercial chemistry had no difficulty in devising a range of products for immediate use. The process of mass production and marketing began in June 1980, when the US Supreme Court, in a historic decision, granted the protection of the patent law to man-made organisms. Earlier fears of ‘Frankenstein monster viruses’ being secretly developed and then ‘escaping’ from laboratories quickly evaporated. In America, where gene-splicing was concentrated, the restrictive regulatory structure on DNA research was replaced, in September 1981, by a voluntary code.[153] In the late 1970s fewer than a score of laboratories and firms specialized in splicing. By the early 1990s there were many thousands. With its immediate and multiplying applications in animal and vegetable food production, energy and, above all, medical science and pharmaceutical products, the new industrial biology promised to be a primary dynamic of the last years of the century.

The speed with which the DNA discovery was developed and applied to practical problems raised questions about the macroscopic end of the biological spectrum: the process of explaining the evolution of social behaviour in terms of the growth and age-structure of whole animal populations, humanity included, and in terms of their genetic constitution. Granted the unitary nature of biological laws, if a scientific revolution could occur at one end of the range, was it not to be expected (or feared) at the other? It was in this area that the social sciences had most conspicuously failed, not least because they had been penetrated by Marxist superstition. The academic imperialism of some social scientists prevented much serious work being done on the lines Darwin’s discoveries had suggested: that minds and mental attitudes evolved like bodies, and that behaviour could be studied like other organic properties, by means of comparative genealogies and evolutionary analysis. Such approaches were, quite irrationally, discredited by the weird racist eugenics which the inter-war fascists (and, in the 1920s, the Communists also) believed and practised.
