They were, ultimately, two sides of the same coin. On the one hand, Reagan had acquiesced in the deployment of American military troops without the possibility of victory—something he swore not to do after Vietnam. Reagan later cited the Beirut bombing as the cause of the Weinberger Doctrine, later renamed the Powell Doctrine or the Reagan Doctrine, depending on who wished to claim credit for it. The principles, as Reagan outlined them in An American Life, were as follows:
1. The United States should not commit its forces to military action overseas unless the cause is vital to our national interest.
2. If the decision is made to commit our forces to combat abroad, it must be done with the clear intent and support needed to win. It should not be a halfway or tentative commitment, and there must be clearly defined and realistic objectives.
3. Before we commit our troops to combat, there must be reasonable assurance that the cause we are fighting for and the actions we take will have the support of the American people and Congress. . . .
4. Even after all these other tests are met, our troops should be committed to combat abroad only as a last resort, when no other choice is available.104
George H. W. Bush would follow these maxims during Operations Desert Shield and Desert Storm in 1990-91, with Colin Powell’s approval. Yet buried within even these guidelines were several pitfalls that would entrap future presidents. President Bill Clinton would stretch to make the Bosnia and Kosovo regions “vital to our national interest.” In neither case was a true victory possible, or even desired. Bush himself had ordered an end to combat in Iraq when the collapse of Saddam Hussein’s regime seemed imminent and the troops were, in Gen. Norman Schwarzkopf’s words, only “150 miles from Baghdad.” Victory was abandoned in favor of adherence to the United Nations resolutions, violating Reagan’s second requirement.
After 9/11, President George W. Bush likewise struggled with clarifying the definition of victory, which, after all, is often simply a matter of the other side giving up (even if informally). When resistance ends, victory occurs. Without any specific leader or organization to force to a treaty table, a conflict can appear endless. It certainly must have seemed so to the Romans in their struggle against Carthage, or to the English in their century-long fight with the French. Yet in both cases, there was an end. Nor is it always possible to protect national security and still maintain the full support of Congress and/or the public, especially in an age when the media is predisposed to hate the application of American power. (During the Iraq conflict from 2003 to 2006, virtually no major media outlets routinely discussed American victory: virtually all were obsessed with American defeat, typically invoking the terms “Vietnam” or “quagmire” to describe the U.S. effort in the Middle East.)105
Hence, the demands of keeping America secure and maintaining public or congressional support for a war may be antithetical at times: there is evidence that after Germany was defeated, American opinion was slowly but steadily turning against forcing unconditional surrender in Japan, despite the fact that Japan, not Germany, had started the war!
The Founders were absolutely unanimous in their view that the security of the United States was paramount. Thomas Jefferson, called by one author a “half-way-pacifist” and routinely hailed by modern libertarians as a “small-government” leader, in fact dispatched America’s first overseas military force to subdue the Barbary pirates.106
In doing so, did he violate his own (and George Washington’s) dictum to avoid “entangling alliances”? After all, his first action was to seek precisely such an alliance with Britain, France, Spain, and other European countries, only to be turned down. (It is ironic that George W. Bush succeeded in putting together a coalition of more than twenty nations in Iraq whereas Jefferson, lauded by liberals, could not construct a coalition with so much as a single foreign power.) Washington himself did not reject alliances outright; rather, he was concerned that the young United States would enter an alliance before its population and military were strong enough to keep it from becoming a pawn. During his tenure as commander of the Continental Army, then as president, Washington had in fact engaged in a number of “entangling alliances,” most of them nonaggression pacts with Indian tribes. American leaders all desperately sought an “entangling alliance” with France during the Revolution.
As president, Jefferson was reluctant to fight Britain not out of any fondness for her (he was certainly a Francophile) but because he understood that the American navy was not yet a match for the British, and that any declaration of war would invite an invasion . . . which it did. To a man, the Founders would have applauded Reagan’s 1986 air strike on Libya (in retaliation for Libya’s role in the bombing of a German disco that killed two Americans, an attack that was only the latest in a string of terror strikes launched from Libya). At the same time, the Founders would have winced at Reagan’s willingness to participate in the Lebanon “peacekeeping” effort—just as they would have been dismayed by the similar actions of Truman, Eisenhower, Kennedy, Johnson, Nixon, Ford, and Carter.
What would George Washington, John Adams, and Thomas Jefferson make of militant Islam, however? It’s difficult to say: the closest proxy we have for such a radical movement in their era is the French Revolution, which is to say, a poor proxy indeed. Although ideologically driven, the French Revolution was secular. It would be two hundred years before the fruits of such a secular influence in political life were felt. Nothing in the French Revolution came close to the jihadist dogma of forcing submission to Islam at gunpoint; nor had anyone in America or France witnessed anything close to “suicide bombers” who randomly targeted civilians. Even for the French, who seldom paled at the sight of bloodshed, this would have constituted an outrage against the “rights of man and the citizen.” In the case of the Barbary pirates, however, whose actions did constitute the terrorism of the day, Jefferson’s response was quick, substantial, and sharp. He sent the entire U.S. Navy to crush all the Barbary States, not just Tripoli (the only one to declare war on the United States).
In terms of employing military force, Reagan learned the lessons of Beirut. But in failing to appreciate what the symbolism of the subsequent withdrawal of the troops would mean to the jihadist mindset, Reagan committed an even greater mistake than when he sent in the Marines in the first place. Bud McFarlane, once a supporter of the deployment, admitted in a New York Times op-ed in 2008 that “the most telling [conclusion about the withdrawal] was the one reached by Middle Eastern terrorists, that the United States had neither the will nor the means to respond effectively to a terrorist attack. . . .”107
McFarlane’s revelation was hardly new in light of Islamic thinking. One only had to look at a Middle Eastern paper, An-Nahar, which in April 1983 predicted that America’s failure to respond to the April embassy bombing would lead to new attacks.108
That radical Islam was fundamentally antithetical to Western concepts of life and liberty—and thus also to incentives based on those values—was still not evident in the 1980s. Thus pulling the Marines out sent yet another message in an all-too-common string of signals that Americans “won’t stick.”
Our commitment to the value of life became a weakness in the eyes of jihadists, not a strength, as made evident in Osama bin Laden’s 1996 comment: “We have seen in the last decade the decline of American power and the weakness of the American soldier who is ready to wage Cold Wars, but unprepared to fight long wars. This was proven in Beirut in 1983 when the Marines fled after two explosions.”109
Over time, the “two explosions” in Beirut would become a recurring theme for bin Laden: in 1998, he sat down for an interview with reporter John Miller for an article in Esquire, in which he called America a “paper tiger” that after “a few blows would run in defeat.”110
Reagan’s biggest mistake lay not only in committing the Marines to Lebanon under conditions in which they could scarcely defend themselves, but also, by withdrawing them, in confirming in the minds of the Islamic radicals that the United States lacked resolve. For a man whose own steadfastness and insight into the Soviet mind ended the ever-present threat of nuclear holocaust, it was an uncharacteristic misjudgment. And, even though he began to correct it almost immediately, America is still fighting the same brand of terrorism today.
7. BARRY MAKES A SPEECH . . . AND THE MEDIA GETS CHILLS UP ITS LEG
No government ought to be without censors, and, where the press is free, no one ever will.
THOMAS JEFFERSON TO GEORGE WASHINGTON, 1792
At one time or another, any American over the age of forty thought, or was told, that the news media was “objective,” “fair,” and “balanced” in its reportage of the news. Older Americans remember a time—dominated by the Big Three (ABC, CBS, NBC) broadcast television networks—when one actually could not discern the political persuasion of nightly news anchors, or even most of the reporters. Occasionally, a star like Edward R. Murrow would take on a controversial story—his most famous being the antics of Senator Joe McCarthy—and thereby reveal his political bent. Most of the time, however, the news was, well, “news,” which covered the major events of the day, beginning with those that occurred in or affected the United States, followed by, if time allowed, those that affected other parts of the world.
Fast forward to the 2008 presidential campaign and the media’s fawning coverage of Illinois senator Barack Obama. That coverage could have been called a number of things, but not, by any stretch, could it be called “fair,” “objective,” “balanced,” or even “news.” Perhaps the grossest perversion of a “news” show was commentator Chris Matthews’s statement that he got a “chill” up his leg when he heard Obama give a speech. At the same time, the media’s character assassination of the Republican vice presidential nominee, Alaska governor Sarah Palin, was unmatched in modern history, including the coverage of Richard Nixon.1
How did American news organizations go from vigilantly scouring out any hint of bias to the point that, except for Fox News (which, because it insists on airing both sides in its “fair and balanced” reporting, is viewed by all the other media as “conservative”), all the so-called news organizations were, as Rush Limbaugh put it, “in the tank” for Obama? In order to answer this question, we must first examine the early history of American news.
As noted in the chapter on Martin Van Buren, newspapers were certainly not born “fair and balanced” or free of bias. The early broadsides of the Revolutionary era, as newspapers were then called, briefly championed or opposed the cause of separation from England (one study found that of the seven papers in Boston, four were “loyalist,” two were “patriot” papers, and one, Thomas Fleet’s Boston Evening Post, attempted to remain neutral during the crisis).2
After independence, these broadsides gave way to a localized, and rather dull, set of neighborhood gossip rags dedicated to community events and occasional police blotters in which the criminal activities of locals were exposed. New York had only one copy of a daily paper for every thirty-two residents, and as one historian observed, “If all the newspapers published in 1790 had been evenly distributed, each American would have received just one issue that year.”3
Van Buren’s creation of the Democratic Party, complete with its imperative to “get out the vote,” changed the local papers full of “clipped news” into full-time propaganda organs whose editors were paid party hacks who unwaveringly expressed only the views of the party that was footing the bill. As one scholar drily noted, the “press was not particularly responsive to its audience during the 1820-1860 years.”4
While not universal by any means, cracks in the wall of partisanship had begun to appear as early as 1836, when James Gordon Bennett’s New York Herald introduced a new “commercial and objective” journalism, proclaiming, “We shall support no party—be the organ of no faction. . . .”5 Bennett announced that his paper would “record facts on every public and proper subject, stripped of verbiage and coloring.”6
A plethora of “penny presses” interested in circulation, not elections, popped up, operating on a new business model under which they tried to appeal to all subscribers.
But it was the American Civil War that brought about a “revolution in journalism.”7 Suddenly, Americans from both the North and South wanted accurate information, not platitudes or propaganda. They needed to know if Billy Yank or Johnny Reb had been killed, who won the battles, and where the enemy armies were at any given time. (In this, the newspapers often proved far more accurate with their information than the scouts or the intelligence units of either army: as the Battle of Gettysburg loomed, Gen. Robert E. Lee learned the identity of the new commander of the Army of the Potomac from a scout who read it in northern papers.) The “home front . . . wanted unvarnished facts,” wrote one scholar, although not everyone submitted to the new demands for accuracy: publisher Wilbur Storey told his reporters, “Telegraph fully all the news you can get and when there is no news send rumors.”8
Nor did it mean that battlefield commanders would willingly cooperate with journalists. Gen. George Meade, the Union commander at Gettysburg, grew so disgusted with the dispatches of one reporter that he tied him up and sent him riding backward on his horse out of his camp with a sign around his neck reading “Libeller of the Press.”9