This label obviously reflects more of a comparative than an absolute description. The Second World War has proved to be a better war than others in the American memory, in terms of common recognition of its cause, general public understanding and acceptance of its objectives, absence of ambivalence about enemies, and an unambiguous perception of courage, sacrifice, and accomplishments of the US armed forces—and of the committed sacrifices of those on the home front. But, of course, history is generally more complicated than the memory we carry.
World War II affirmed in many basic ways the revolutionary ideal of the “citizen soldier.” The nineteenth-century narrative that celebrated citizens taking up arms to defend the Republic seemed validated in this war. Civilians in huge numbers, perhaps as high as 12 percent of the population, put on uniforms and took up arms for the duration. Many fought bravely and well. Yet the experience was far removed from the casual militia service of history's memory.
In the Second World War, well-trained and well-equipped citizens served, sometimes for years, in uniform far from home. If they represented a more significant proportion of the population than in any previous war, they were still a minority. If this war was about defending the country and its values, it was also about completing a task that was defined by its participants more in practical than patriotic terms. And the celebrated commitment to American values in this “good” war was undercut by remarkably persistent racism, reflected ironically in the military force engaged in a global defense of “freedom.”
The Second World War also culminated a century and a half of discussion about the obligations of American democracy to those who served to defend that democracy. There were no longer any political advocates of the view that since serving to defend the Republic was an obligation of citizens, there should be no expectation of special gratuity or treatment. It was hard to sustain the rhetoric of everyman's obligation when, even in this massive mobilization, most Americans did not serve. The postwar support for the World War II veterans and their families was unprecedented in American history and established the template for veterans' benefits in all of the wars that would follow.
The war itself became a template for how and why Americans would fight. As such it would influence actions and certainly perceptions of the wars that would follow. This war would prove a heavy political, cultural, and military burden for the next generations. As a result, it deserves some careful consideration. Wars are complicated things. They demand tremendous sacrifice. The memory of war should always include a recognition of the human cost. These costs are borne not only by those who perish in battle, but also by those who survive it.
 
 
The Japanese attacked US military installations in Hawaii on December 7, 1941. This action proved the final incontrovertible reason for American entry into the war. The New York Times captured the common feeling:
“The United States has been attacked. The United States is in danger. Let every patriot take his stand on the bastions of democracy. We go into battle in defense of our own land, of our present and our future, of all that we are and all that we still hope to be, of a way of life which we have made for ourselves on free and independent soil, the only way of life which we believe to be worth living.”
President Franklin Roosevelt described December 7, 1941, and the actions that took place on that day, as “a date which will live in infamy.” He requested, and Congress provided, a declaration of war against Japan, and when Germany and Italy joined Japan's war against the United States, Congress declared war against those other Axis powers. By the second week of December 1941, the war that had engaged much of the rest of the world since the 1930s had a new and major participant.
US government agencies advertised the war as a fight against evil in defense of the country—and a commitment to make the world a better place. Aggressors started it and we would finish it.[2]
There is little doubt that this view of the American role informed the public understanding of the war, but it did not define the war's language and symbols. There was little of the “save the Union” drumbeat of the North in the Civil War or the idealistic “make the world safe for democracy” zeal of World War I. Despite the heroic rhetoric and self-image—and contrary to some of the memories of noncombatants that would follow the war—this would prove a complex and difficult war, one that would inflict some very heavy costs. More than 16 million Americans would serve in the military during the war, including a half-million women. More than 400,000 of these servicemen and women would die. The nuclear bombing of Hiroshima and Nagasaki in August 1945 ended a war and announced a new era. World War II would forever alter the place of America in the world.
America's mood during the war was more practical than heroic. But there was a sense of drama and an enduring memory of heroism that was based on a remarkable set of experiences marked by true courage and sacrifice—at places like El Guettar, Salerno, Monte Cassino, Normandy, Bastogne, Guadalcanal, Iwo Jima, Okinawa, the Coral Sea, Leyte Gulf, and the Philippine Sea and in B-24s over Germany, in submarine warfare, on aircraft carriers, and in tanks. All of these mark historic battles with a significant series of technological innovations. Each battle, each innovation, incurred heavy casualties.
The American public had been slowly, reluctantly, coming to understand the likelihood of war prior to Pearl Harbor. In the 1930s Congress and the administration tried to build barriers to prevent another war. A series of neutrality acts had imposed heavy restrictions on the sale of US goods to belligerents. On the other hand, following the fall of France to the Germans in June 1940, Americans increasingly understood that involvement was perhaps inevitable. Public opinion surveys indicated that the dominant American view was supportive of the Allies—but support did not readily translate into approval of American military engagement.
Yet despite a growing recognition of the inevitability, if not the necessity, of American involvement in war, politicians trod softly. Europe had been at war for two years and East Asia for even longer when the Japanese attacked Pearl Harbor, but even so, the United States was not ready. In 1940 and 1941 troops trained while wearing World War I uniforms, carrying wooden guns, and riding in trucks that had signs on them proclaiming them to be “tanks.” The distinguished military historian Russell Weigley wrote of the US Army, “The historic preoccupation of the Army's thought in peacetime has been the manpower question: how, in an unmilitary nation, to muster adequate numbers of capable soldiers quickly should war occur.”[3]
Unlike in any previous American war, the government began to muster troops for World War II prior to American entry into the war. In September 1940 Congress approved the Roosevelt administration's request for a draft law—an immediate peacetime draft. When he signed the legislation, Roosevelt invoked the historical view of the citizen soldier, claiming that the action “has broadened and enriched our basic concepts of citizenship. Besides the clear and equal opportunities, we have set forth the underlying other duties, obligations and responsibilities of equal service.”[4]
It was an unprecedented move, audacious even, in an election year in which Roosevelt was standing for a third term. But the American public had warmed to the idea, given the challenges of the war already being fought. In 1940, 89 percent of the public thought the draft was a good idea, up from 35 percent a year earlier.
The 1940 draft law provided for up to nine hundred thousand men to be drafted. Every male in the country between the ages of twenty-one and thirty-six, including foreign-born residents, had to register. All were subject to being called up for one year of service and for ten years of reserve duty. Congress restricted their service by prohibiting these draftees from serving outside the Western Hemisphere except in US possessions. There were few categorical deferments, and the legislation provided for local draft boards to determine individual exemptions.
The first draft lottery was held on October 29, and Secretary of War Henry Stimson drew the first capsule. In a moment heavy with historical symbolism, he was blindfolded with a cloth taken from a chair used at the signing of the Declaration of Independence. The capsules had been stirred with a piece of wood from Independence Hall.[5]
Those responsible for overseeing the peacetime draft struggled to develop administrative rules and procedures. In the final accounting, they were not always able to meet the War Department's manpower goals. Local draft boards were not consistent from jurisdiction to jurisdiction in allowing deferments. Married men with dependents were generally eligible for an exemption—and for many boards this meant that if a man's wife worked, she was not dependent and he was not deferred. In the early months there was a significant increase in marriages among men who were age-eligible for the draft.
College officials pushed hard for educational deferments, but Congress and the Roosevelt administration resisted this. As Lieutenant Colonel Lewis Hershey, the deputy director of the Selective Service, asked, “Is the college student, per se, of more importance than the automobile mechanic or farm laborer who is now working and producing?”[6]
Following the declaration of war, Congress, with little debate, lowered the induction age to twenty, ended the restrictions on draftees serving overseas, and provided that all who were inducted were liable to serve for the duration of the war plus six months. In the fall of 1942 Congress authorized the drafting of eighteen-year-olds. This was partially in response to the desire to reduce instances of calling up married men with children—and also in response to the army's experience that the youngest were more physically fit and more willing to take combat assignments.
The country needed to mobilize more than soldiers for this war. Prior to the declaration of war, the tremendous industrial capacity of the United States was underutilized as a result of plant closings and layoffs from the Great Depression. Wartime production picked up more rapidly than many expected it could. In 1940 the government had spent $1.8 billion on the military. In 1942 the United States spent $22.9 billion. By war's end the United States was not just equipping and arming its own substantial military force but also providing significant support for its allies.[7]
From the fall of 1940 through the summer of 1942, the army built forty-two new bases or camps. By the summer of 1942 it was inducting 14,000 men per day and straining to provide facilities and training programs. From October 1940 to March 1947, when the World War II draft expired, the Selective Service registered 49 million men, selected 19 million for conscription, and saw 10 million inducted. George Gallup, whose polls indicated that public approval for the draft never dropped below 75 percent, concluded, “Few programs in the nation's history have ever received such widespread favorable reaction from the people as the handling of the Selective Service draft.”[8]
With as many as 184,000 local draft board members, and with their authority over individual cases, there was tremendous variation in the implementation of the draft. Farmers, for example, had an advantage in securing occupational deferments: in 1944 some 17 percent of age-eligible farmworkers received deferrals, whereas only 9 percent of other eligible workers did. Some farm-state senators had pushed for categorical deferments for farmers, but the administration defeated this initiative, largely by allowing even more local board discretion in providing deferrals. Deferments for workers in industry proved more complicated. And there is clear evidence that some local draft boards used the withholding of deferrals as a tool to control labor activity or even to punish absenteeism.
Despite pressure from colleges for educational deferments, the administration resisted. On the other hand, it worked with the American Council on Education to develop campus-based training programs. The result was the Army Specialized Training Program, which ultimately enrolled more than 150,000 trainees on campuses. Roosevelt supported the ASTP because he was influenced by college officials who insisted that depleting college enrollments through the draft would lead schools to close. But Lieutenant General Lesley McNair complained that he needed 300,000 more men in the army and was facing a declining quality of inductees; showing his frustration, he protested, “We are asked to send men to college!” Others agreed—it was, George Flynn concluded, a weak program that “did provide a subsidy to American education during the war.”[9]
The V-12 Navy College Training Program similarly worked to meet both military needs and those of the higher-education community. Because the navy program aimed at producing officers rather than technical trainees, it proved less controversial. By the war's end, V-12 had enrolled some 125,000 navy and Marine Corps officer candidates at 131 colleges and universities.
Late in 1942 a presidential order banned the military from recruiting volunteers from among those men who were already eligible for the draft. This forced the navy and the Marine Corps also to turn to the draft. Previously, the army had argued that those services were taking away some of the best potential soldiers with their recruiting methods. In the fall of 1942, sailors were on average three years younger than the men serving in the army. Following the presidential order, the services could recruit only seventeen-year-olds, who were eligible to serve but were not draft eligible. The marines and the navy quickly began recruiting from this population, resulting in their maintaining a younger average age than the army. In 1944 the average soldier was twenty-six years of age; the average sailor was twenty-three and the average marine twenty-two.[10]
