Evil Geniuses: The Unmaking of America: A Recent History


But then around 1980, under the camouflage of high inflation, private colleges started increasing their prices every year a bit faster than inflation. Public colleges soon followed suit, state legislatures started cutting university funding, and that vicious cycle picked up speed. In the 1990s the price of a college education ballooned even faster—especially at public institutions—and never stopped.

Since 1981 states have cut their funding of public colleges and universities by half. The real, inflation-adjusted cost of attending a four-year college has almost tripled. That undergraduate year at the University of Nebraska has gone from the equivalent of $10,000 in the 1970s to $25,000 now. The $22,000 that Harvard charged in the 1970s, which my parents could just scratch together, now runs $72,000 a year, all in.

Only a quarter of people graduating from four-year public colleges and universities in the early 1990s had student loan debt; by 2010, two-thirds did. Credit had been deregulated in the 1980s just in time for the business of student loans to explode in the ’90s. When I graduated college in 1976, the total amount of money lent to students to pay for higher education each year was the equivalent of $8 billion—but by the first school year of the 1980s, it had jumped to $22 billion, and in 2005 it reached $100 billion. In other words, over those three decades, while the number of students grew by half, the amount of money they borrowed each year increased twelvefold. For the financial industry, a small revenue stream turned into a great roaring river. For the 45 million mostly young and youngish Americans who today carry an average of $35,000 apiece in student debt, it’s yet another source of economic insecurity that did not exist before everything changed in the 1980s.

From the decade my parents attended college through the decade I attended college, the percentage of all Americans with four-year degrees more than tripled. But then college became terribly expensive, and that constant, rapid increase in people attending and graduating, hard evidence of the American Dream working, slowed way down, especially for men.

I’m fairly sure that an American college education today isn’t two or three times as good as it was when I went, even though it’s two or three times as expensive. Rather, in the 1980s everything in America became more of a commodity valued only by its market price, and a college degree was turned into a kind of luxury good, the way it had been back in my grandparents’ day. But it wasn’t just status anxiety that drove up the price of college in the 1980s, the decade in which Hermès started selling a certain leather handbag for ten thousand dollars apiece just because it could. A four-year degree simultaneously became an expensive luxury good and practically essential to a middle-class life, because the economic value of a degree also wildly increased during the 1980s and ’90s.

College graduates had always been paid more on average than people with less education. But that college premium had actually shrunk during the twentieth century before 1950, and even after that didn’t grow much—until 1980, when it exploded. In the early 1980s college graduates of all ages earned a third more than people who’d only graduated high school. Just a decade later, in 1992, they earned two-thirds more, as they still do. What’s worse, for people who don’t have college degrees, average real pay has gone down since then by 10 or 20 percent. In other words, a college degree became a more essential but also much less affordable ticket to the increasing prosperity that, until 1980, all Americans had enjoyed.

Yet in this century, there’s a bait-and-switch lose-lose-lose punchline to the story. Since 2000, with two generations of college graduates having burdened themselves with unprecedented debt to pay for the unprecedented new costs of college, the college-grad income premium basically stopped increasing. Today four out of ten recent American college graduates are employed in jobs that don’t even require a college degree. And while college graduates used to accumulate more wealth at younger ages than people without degrees, according to a 2018 Federal Reserve study, the costs of college and of student debt have now erased that wealth premium for younger college-educated Americans.

If the American Dream had one simple definition, it was that hard work led to a better life, materially and otherwise, if not for oneself then for one’s children and grandchildren. In the late 1800s, when Horatio Alger published Ragged Dick and his other fictional chronicles of upward economic mobility, America’s exceptionalism wasn’t just a self-flattering myth. Back then a lot more Americans than people elsewhere really did move up the ladder from generation to generation. Our edge over Britain and the rest of Europe was diminishing by the 1950s, but economic mobility remained a real thing in the United States, onward and upward—until 1980.

That change is particularly clear in a recent study conducted by Stanford and Harvard economists. In 1970, they found, almost all thirty-year-old Americans, 92 percent, were earning more than their parents had at that age and older. Among Americans in their early thirties in 2012, however, only half were earning more than their parents had—and for sons compared to fathers, even fewer. That enormous difference over two generations was mainly caused not by slower economic growth, the economists found, but by how American economic growth was shared after 1980. If we’d continued slicing the pie as we’d done from 1940 until 1980, then 80 percent of those Gen-Xers would be earning more money than their Silent Generation parents, instead of only 50 percent.

These days, if you grow up poor in America, you have less than a one-in-four shot of becoming even solidly middle class—one in three if you’re white, one in ten if you’re black. If you grow up right in the economic middle, the chances are you won’t move up at all. On the other hand, if you come from an upper-middle-class or rich household, the odds are strong you’ll remain upper middle class or rich as an adult.

When inequality started increasing in the 1980s, separating the fortunate few and the unfortunate majority, it showed up geographically as well: not only were the rich getting richer, but neighborhoods and cities and regions segregated accordingly. The economist Enrico Moretti calls this the Great Divergence. Before the 1980s, the decade in which gated communities became common, Americans tended to live more democratically. Americans with more money and less money were likelier to live alongside one another.

In 1970 only one in seven Americans lived in a neighborhood that was distinctly richer or poorer than their metropolitan area overall, but that fraction began growing in the 1980s, and by the 2000s it was up to a third. Before the 1980s, two-thirds of Americans lived in middle-income neighborhoods; now a minority of us do, a fact that makes the terms thrown around about the middle class—disappeared, hollowed out—seem less metaphorical.*

As the American middle class quickly grew from the 1940s to the ’70s, so did economic equality—that is, the income gap between richer and poorer steadily shrank. Interestingly, that same leveling also happened at the same time among cities, with wages back then growing faster in poorer places than they did in more affluent ones, allowing people in the laggard cities to catch up. But around 1980 that stopped too. Since then the average salary premiums for jobs in and around economically robust cities have grown to be several times as large as they’d been in the 1970s, tens of thousands of dollars a year more per employee instead of merely thousands. After 1980 college graduates with skills started getting paid less if they lived in and around Cleveland rather than thriving Omaha, or in Stockton rather than thriving San Jose, so they moved.

This Great Divergence is yet another way in which growing economic inequality gets built into the system and becomes self-perpetuating, with residents of richer cities and regions getting even richer while their fellow citizens in unfortunate places fall further behind.

Not only do people who live in Boston or Raleigh or Austin get to choose from better jobs, their wealth also increases more because of real estate prices, which have risen more than twice as fast in cities in general as in rural areas. Superhigh prices for apartments and houses, in turn, mean that it’s harder for people from left-behind places to afford to migrate to booming urban areas, which is bad for them and probably for U.S. economic growth too. And people in the booming cities who aren’t Internet workers or their masseuses have a much harder time affording to stay. In Seattle in the 1960s, for instance, a typical janitor and a typical lawyer both spent 10 or 15 percent of their incomes to live in an apartment or house they owned or rented; today the Seattle lawyer still pays 15 percent for housing, but the Seattle janitor has to pay around 40 percent.

One of my premises in this book is that a real and mainly good expression of American exceptionalism had been our willingness and eagerness to take on the new. That often meant pulling up stakes and hitting the road in search of new work or a new life. To be American was to be venturesome. In the heyday of the so-called American Century, in the 1940s, people were doing that in a big way. The percentage of people who lived in a state other than the one they were born in rose steeply, and it kept rising as the country boomed and became more equal—and then it stopped rising around, yes, 1980. Since then the rate at which people move to a new state or city for a new job has fallen by half, and it is now at the lowest it’s been since the government began tracking it. People without college educations are less likely to relocate, and in just the last decade people in their early twenties have suddenly become stationary, a third unlikelier to move than in the 2000s. Is this new geographic immobility more a cause or an effect of our new economic insecurity and inequality and immobility? Like so many spiraling vicious cycles, it’s surely all of the above.

As the disappearance of factory jobs made cities like Detroit and Buffalo losers in the Great Divergence, automation and the digital revolution and globalization made certain cities big winners. Cities with tech companies and lots of college graduates were positioned to grow even faster in this postindustrial age. Unlike most of the other economic changes I’ve discussed, like shareholder supremacy and ending antitrust and defeating organized labor, this wasn’t part of the original strategy of big business and the right. But for them it isn’t exactly collateral damage either, because it has been a political boon. So far they’ve brilliantly managed to redirect the anger of most of the (white) left-behinds to keep them voting Republican, by reminding them that they should resent the spoiled college-educated liberal children and grandchildren of the acid amnesty abortion liberal elite who turned on them and their parents and grandparents in the 1970s.

* The good news is that while neighborhoods have gotten more economically homogeneous, they’ve also become more racially and ethnically diverse. In 1980 the residents of at least a quarter of all U.S. census tracts, each a neighborhood of a few thousand people, were essentially all white and non-Hispanic. Nowadays only 5 percent of white Americans live in such neighborhoods, most of them in rural areas.

Back in the 1930s and ’40s, there had been very left-wing Democrats with serious national prominence and significant political bases. Franklin Roosevelt’s own vice president Henry Wallace was one, as most notably was Senator Huey Long of Louisiana, who introduced bills in the 1930s to enact a 100 percent income tax on earnings over the equivalent of $20 million and a wealth tax of 100 percent on everything over $1 billion. Which made the president seem moderate when his 1935 “Soak the Rich Act” raised the tax on income over $2 million to 55 percent. “Political equality,” FDR said in 1936, is “meaningless in the face of economic inequality.” In what he pitched as a Second Bill of Rights, he proposed guaranteeing all Americans “the right to earn enough” for “a decent living” and “a decent home” and to have “adequate medical care” paid for with “a tax [on] all unreasonable profits, both individual and corporate.” That was 1944, Peak Leftism for Democrats on economics.

After the war, the American economic left existed in a meaningful way only within organized labor. And by the 1980s, unions were reduced to desperate parochial struggles to save jobs in declining heavy industries and, as mistrust of government grew, to unionizing more government employees. Moreover, the left offered no inspiring, politically plausible national economic vision of a future. In response to economic Reaganism, the national liberals were committed to preserving the social welfare status quo for old people and the (deserving) poor, and to convincing America that Democrats were now modern and pragmatic, not wasteful bleeding-heart suckers or childish protesters or comsymp fools. Very few believed anymore that unreasonable profits could even be a thing.

Against a triumphant Republican Party high on right-wing ideology, liberals’ new selling point was their lack of any ideology at all. The faction that was now dominant in the Democratic Party had been pushing for a more centrist economic and social welfare policy since the 1970s, but the Republican Party after 1980 had no comparable moderating faction—which in a two-party system meant that Democrats kept moving toward a center that kept moving to the right.

The traumatic Republican landslide in the presidential election of 1972 persuaded generations of Democrats that they must tack toward the center no matter what. The don’t go left lesson was only reinforced during the 1980s, when three quite center-left presidential candidates in a row lost. In the 1984 Democratic primary, Gary Hart was still the neoliberal apostate, running against the supposed mustiness of the New Deal and the Great Society and government and the Establishment. “The fault line of the party,” he said then, “is now between those who have been in office for 20 or 25 years and those who have come into office in the last 10 years, and who are less tied to the arrangements dating to [Franklin] Roosevelt.” That was because “the solutions of the thirties will not solve the problems of the eighties.”

In fact, as the political journalist and author Richard Reeves wrote in 1984 of “Democratic liberalism’s traumatic break with organized labor,” the party’s neoliberal cutting edge considered unions not just “the solution of the 1930’s” but “the problem of the 80’s,” an obsolete obstacle in the way of “a socially liberal, high-technology, high-growth America….We’re not going to go down with the crew. Sorry, guys!”

In other words, the New Democrat avatar Hart was telling receptive twenty-something liberal yuppies like me that it was passé to fret about big businesses getting too big or Wall Street speculating too wildly or unions and unionism being definitively defeated, and that it was folly to think of Social Security and Medicare as models for any kind of expanded social democracy. And he was effectively persuading everyone, particularly people who had blue-collar jobs and who used to vote Democratic because of all those arrangements dating to Roosevelt, that the two parties did not really disagree about economics.

Rather than offering a distinct programmatic vision for the political economy apart from the adjective new, Hart was selling sauciness and smartness and cool, cleverly exploiting the generation gap a generation after it had become a thing, just as the youngest boomers could vote and the oldest ones were about to enter middle age. He himself was approaching fifty, only eight years younger than his primary opponent, former vice president Walter Mondale, whom he nevertheless caricatured as an old fogey. Of the youth of 1984, Hart said in a perfect soft-pedal pander, “You’re dealing here with very sophisticated people. They don’t want a messiah. They don’t want me to personalize an entire generation’s yearnings—just to be the vehicle of its expression. I see this campaign as a liberating vehicle.” In his campaign stump speech, he would repeat the phrase “new generation” a dozen times.

It nearly worked. I was pleased when he ran a close second in Iowa, finished first in New Hampshire a week later, and after that won a majority of states. I remember being in a Des Moines hotel room covering the Democratic caucuses for Time, feeling so state-of-the-art to be filing a story about Hart, a so-called Atari Democrat, through a twenty-eight-pound portable computer connected to a shoebox-size dial-up modem to which we’d docked a curly-corded desktop telephone handset.

Just a few weeks earlier, in January 1984, Apple had introduced the Macintosh. Its famous Super Bowl ad, based on George Orwell’s Nineteen Eighty-Four, featured a heroine smashing the tyrants’ huge telescreen, a lone nonconformist underdog spectacularly defying the oppressive Establishment. It suited the moment and digital early adopters who were politically aware but not actually, specifically political, in a stylish little allegory with which everyone from Ayn Rand fans to Hart fans to Deadheads might identify.

Steve Jobs, not yet thirty, had just become the sort of emblematic generational avatar that Gary Hart pretended he wasn’t desperate to be. In a 1984 interview, Jobs bragged about his vast wealth—“at 23, I had a net worth of over a million…and at 25, it was over $100 million”—and about his indifference to it: “I’m the only person I know that’s lost a quarter of a billion dollars in one year.” Jobs didn’t mind coming across as a jerk, just not a standard business jerk—because he was “well-grounded in the…sociological traditions of the ’60s,” like other Silicon Valley baby boomers.

“There’s something going on here,” he said, “there’s something that is changing the world and this is the epicenter. It’s probably closest to Washington during the Kennedy era or something like that. Now I start sounding like Gary Hart.”

“You don’t like him?” the interviewer asked.

“Hart? I don’t dislike him. I met him about a year ago and my impression was that there was not a great deal of substance there.”

“So who do you want to see—”

“I’ve never voted for a presidential candidate. I’ve never voted in my whole life.” Meaning he’d chosen not to vote against Ronald Reagan in 1980.

Jobs was extreme—in devotion to his work, in arrogance and self-satisfaction, in wealth, in lack of interest in electoral politics or sympathy for the unfortunate—but he was also archetypal. For a decade, politics and social policy had been the passionate and unavoidable topic A in America because of the struggles over racial justice and the Vietnam War, but once those problems were addressed and solved, respectively, only politics geeks remained fully engaged. For the remainder of the century, the issues that aroused liberal and left passions in a major way—nuclear weapons, civil wars in Central American countries, the AIDS epidemic—were intermittent and never directly concerned the U.S. political economy.

People like Steve Jobs, or at least people like me, who did vote, always for Democrats, weren’t anti-union or anti-welfare or anti-government. The probability that elected Democrats would tend to increase my taxes wasn’t a reason I voted for them, but my indifference to the financial hit—like Steve Jobs!—felt virtuous, low-end noblesse oblige. However, even after the right got its way on the political economy, many people like me weren’t viscerally, actively skeptical of business or Wall Street either. Big business—in my case, various media and entertainment companies—paid me well and treated me fine, which probably didn’t sharpen my skepticism toward a political economy that was being reordered to help big business (and people like me). When it came to the millions of losers, I felt…grateful that my work couldn’t be automated or offshored or outsourced, and I figured, Creative destruction, invisible hand, it’ll work itself out, and voted for liberal politicians who said we should retrain steelworkers to become computer programmers. In Spy, we didn’t avoid politics—we published Washington-themed issues and an investigative piece about Paul Manafort and Roger Stone called “Publicists of the Damned” and a cover story called “1,000 Reasons Not to Vote for George Bush”—but it wasn’t the magazine’s main focus.

Very few people I knew voted for Reagan, but given that he didn’t do anything crazy and started making peace with the Soviet Union, affluent college-educated people, liberals and otherwise, didn’t disagree very ferociously about politics in the 1980s and ’90s, and certainly not about economics. In retrospect, that rough consensus looks like the beginning of an unspoken class solidarity among the bourgeoisie—nearly everyone suspicious of economic populism, but some among us, the Republicans, more suspicious than the rest. Affluent college-educated people, Democrats as well as Republicans, began using the phrase socially liberal but fiscally conservative to describe their politics, which meant low taxes in return for tolerance of…whatever, as long as it didn’t cost affluent people anything.*

* It was a libertarianism lite that kept everything nice and clubbable and, unlike Republican conservatism, at least had the virtue of ideological consistency.

To their great credit, the New Democrats took the lead early to start making people aware of CO2-induced global warming. In 1980, Senator Paul Tsongas conducted the first congressional hearing on the subject, and Congressman Al Gore convened the next in 1981. It was a wonky new problem that required new solutions. Of course, back then, briefly, it also had the New Democrat appeal of not seeming anticapitalist. And if the Democrats’ union allies warned that policies to reduce CO2 might be bad for industrial jobs, they were just being their shortsighted uneducated-palooka selves.

When Hart ran a second time for president in 1988, one of his tax policy advisers was Arthur Laffer, the inventor of supply-side economics. When Jerry Brown ran for the 1992 Democratic nomination, he also sought Laffer’s help to devise some kind of tax scheme “that was clear and easy to articulate,” and Laffer himself says he voted for Bill Clinton.

Clinton was one of the founders in 1985 of the Democratic Leadership Council. It was an attempt by an endangered species, white Southern Democratic politicians, to remain relevant twenty years after their die-off had begun. It also became a think-tankish anchor for Democrats who didn’t disagree with Republicans that the only acceptable new solutions to any social problem were market-based.

For the remainder of the century, no candidate from the Democratic left became a plausible finalist for the nomination. In the 1988 primary, Jesse Jackson ran as a full-on leftist, calling for single-payer healthcare, free community college, a big federal jobs program, and the cancellation of Reagan’s tax cuts for the rich—and by sweeping the black South and winning everywhere among voters under thirty, he beat Joe Biden and Gore and came in second to Dukakis, but…he was never going to be nominated. A Vermont mayor who’d endorsed him, Bernie Sanders, was elected to the House in 1990 as a socialist, cute, but really, so what? He was a quirky retro figure, some Ben & Jerry’s guy who didn’t realize the 1960s were over and was channeling Eugene Debs and Norman Thomas from the ’20s and ’30s. In 1992, when Clinton won the nomination, his only serious competitors were two fellow New Democrats, Brown and Tsongas. Democrats had settled into their role as America’s economically center-right party. There was no organized, viable national economic left in the vicinity of power.
