Throwing Rocks at the Google Bus: How Growth Became the Enemy of Prosperity

As digital labor scholar and activist Trebor Scholz has pointed out,[41] in crowdsourcing there’s no minimum wage, no labor regulation, no governmental jurisdiction. Although 18 percent of workers on Amazon Mechanical Turk are full-time laborers, most of them make less than two dollars an hour. Amazon argues that the platform is all about choice and empowerment, that workers can “vote with their feet” against bad labor practices. But when even minimum-wage jobs aren’t available to many workers today, they are empowered to make only one choice or none at all.

The other answer—one I’ve argued myself—is for displaced workers to learn code. Anyone competent in languages such as Python, Java, or even Web coding such as HTML and CSS is currently in high demand by businesses that are still just gearing up for the digital marketplace. However, as coding becomes more commonplace, particularly in developing nations such as India, we find a lot of that work being assigned piecemeal by computerized services such as Upwork to low-paid workers in digital sweatshops. This is bound to increase. The better opportunity may be to use that code literacy to develop an app or platform oneself, but this means competing against thousands of others doing the same thing in an online marketplace ruled by the same power dynamics as the digital music business.

Besides, learning code is hard, particularly for adults who don’t remember their algebra and haven’t been raised thinking algorithmically. Learning code well enough to be a competent programmer is even harder. Although I certainly believe that any member of our highly digital society should be familiar with how these platforms work, universal code literacy won’t solve our employment crisis any more than the universal ability to read and write would result in a full-employment economy of book publishing.

It’s actually worse. A single computer program written by perhaps a dozen developers can wipe out hundreds of jobs. Digital companies employ ten times fewer people per dollar earned than traditional companies.[42] Every time a company decides to relegate its computing to the cloud, it is free to release a few more IT employees. Most of the technologies we are currently developing replace or obsolesce far more employment opportunities than they create. Those that don’t—technologies that require ongoing human maintenance or participation to work—are not supported by venture capital for precisely this reason. They are considered unscalable because they require more paid human employees as the business grows.

Finally, there are jobs for those willing to assist with our transition to a more computerized society. As employment counselors like to point out, self-checkout stations may have cost you your job as a supermarket cashier, but there’s a new opening for the person who assists customers having trouble scanning their items at the kiosk, swiping their debit cards, or finding the SKU code for Swiss chard. It’s a slightly more skilled job and may even pay better than working as a regular cashier. But it’s a temporary position: soon enough, consumers will be as proficient at self-checkout as they are at getting cash from the bank machine, and the self-checkout tutor will be unnecessary. By then, digital tagging technology may have advanced to the point where people just leave stores with the items they want and get billed automatically.

For the moment, we’ll need more of those specialists than we’ll be able to find: mechanics to fit our current cars with robot drivers, and engineers to replace medical staff with sensors and to write software for postal drones. There will be an increase in specialized jobs before a precipitous drop. Already in China, the implementation of 3-D printing and other automated solutions is threatening hundreds of thousands of high-tech manufacturing jobs, many of which have existed for less than a decade.[43] American factories would be winning back this business but for a shortage of workers with the training necessary to run an automated factory. Still, this wealth of opportunity will likely be only temporary. Once the robots are in place, their continued upkeep and a large part of their improvement will be automated as well. Humans may have to learn to live with it.

It’s a conundrum that was first articulated back in the 1940s by Norbert Wiener, the inventor of cybernetics and the feedback mechanisms that turned plain old machines into responsive, decision-making robots. Wiener understood that in order for people to remain valuable in the coming technologized economy, we were going to have to figure out what we can do—if anything—better than the technologies we have created. If not, we were going to have to figure out a way to cope in a world where robots tilled the fields. His work had influence. In the 1950s, members of the Eisenhower administration began to worry about what would come after industrialism, and by 1966 the United States convened the first (and only) sessions of the National Commission on Technology, Automation, and Economic Progress. The six volumes it published were largely ignored, but they did serve as the basis for much of Daniel Bell’s highly regarded work in the 1970s about what he called the “post-industrial economy.” His main recommendation was to make our technological progress less “random” and “destructive” by matching it with upgraded political institutions.[44]

Today, it’s MIT’s Brynjolfsson and McAfee who appear to be leading the conversation about technology’s impact on the future of employment—what they call the “great decoupling.” Their extensive research shows, beyond reasonable doubt, that technological progress eliminates jobs and leaves average workers worse off than they were before. “It’s the great paradox of our era,” Brynjolfsson explains. “Productivity is at record levels, innovation has never been faster, and yet at the same time, we have a falling median income and we have fewer jobs. People are falling behind because technology is advancing so fast and our skills and organizations aren’t keeping up.”[45]

However, in light of what we know about the purpose of the industrial economy, it’s hard to see this great decoupling as a mere unintended consequence of digital technology. It is not a paradox but the realization of the industrial drive to remove humans from the value equation. That’s the big news: the growth of an economy does not mean more jobs or prosperity for the people living in it. “I would like to be wrong,” a flummoxed McAfee explained to MIT Technology Review, “but when all these science-fiction technologies are deployed, what will we need all the people for?”[46]

When technology increases productivity, a company has a new excuse to eliminate jobs and use the savings to reward its shareholders with dividends and stock buybacks. What would have been lost to wages is instead turned back into capital. So the middle class hollows out, and the only ones left making money are those depending on the passive returns from their investments.

Digital technology merely accelerates this process to the point where we can all see it occurring. As Thomas Piketty’s historical evidence reveals, the ever-widening concentration of wealth is not self-correcting. Capital grows faster than the rest of the economy. Or, in even plainer language, those with money get richer simply because they have money. Everyone else—those who create value—gets relatively poorer. In spite of working more efficiently—or really because of it—workers get a smaller piece of the economic pie.

This income disparity is not a fact of nature or an accident of capitalism, either, but part of its central code. Technology isn’t taking people’s jobs; rather, the industrial business plan is continuing to repress our ability to generate wealth and create value—this time, using digital technology. In other words, the values of the industrial economy are not succumbing to digital technology; digital technology is expressing the values of the industrial economy. The recent surge in productivity, according to Piketty, has taken this to a new level, so that the difference between capital and labor—profit and wages—is getting even bigger.[47] Leading-edge digital businesses have ten times the revenue per employee as traditional businesses. Those who own the platforms, the algorithms, and the robots are the new landlords. Everybody else fights it out for the remaining jobs or tries to squeeze onto the profitable side of the inevitable power-law distribution of freelance creators.

But the beauty of living in a digital age is that the codes by which we are living—not just the computer codes but all of our laws and operating systems—become more apparent and fungible. Like time-lapse film of a flower opening or the sun moving through the sky, the speed of digital processes helps us see cycles that may have been hidden from us before. The fact that these processes are themselves composed of code—of rules written by people—prepares us to intervene on our own behalf.

THE UNEMPLOYMENT SOLUTION

A good programmer always begins with the question What problem are we trying to solve? So let’s look at our situation from the digital perspective: Are we looking for new ways to grow the economy? Or are we trying to figure out how to get people jobs? Sure, it’s a better goal than abstract, senseless, environment-depleting growth. But is it the ultimate aim here? Is this the most foundational question we can ask?

Perhaps so. Both the business and the technology press are filled with stories about how computers and robots change employment. In politics, almost any issue comes down to an argument to create jobs. War, immigration, housing, energy, budget, fiscal, and monetary policy debates all find their footing in employment for Americans: How do we get people back to work? How do we bring jobs back from overseas? How does the price of oil affect jobs? How do we raise the minimum wage without costing any jobs? How can we retrain our workforce for the jobs of tomorrow? It’s as if the highest moral good and core human need is jobs.

I’m not so sure it should be. People want stuff. They want food, shelter, entertainment, medical care, a connection to others, and even a sense of purpose. But employment—a job one goes to, clocks in, does some work, clocks out, and returns home—isn’t really high on the hierarchy of needs for most of us. Dare we even admit it, but who really wants a job? We are convinced that unemployment is necessarily a bad thing. Free-market advocates use high unemployment figures as proof that Keynesian-style government spending doesn’t really move the needle. Leftists use the same figures to show that corporate capitalism has reached its endpoint: investors make money in the stock market while real people earn less income, if they can find jobs at all.

The seemingly endless “jobless recovery” makes no sense at all, particularly at a time when many of us are working longer hours as overextended freelancers or the nominally unemployed than we did when we had real jobs. It’s hard to imagine how this all looks to young people just graduating college, who now chase unpaid internships with more energy than those in previous generations sought paying work.

But what if joblessness were less of a bug than a feature of the new digital economy?

We may, in fact, be reaching a stage of technological efficiency once imagined only by science-fiction writers and early cyberneticists: an era when robots really can till the fields, build our houses, pave our roads, and drive our cars. It’s an era that was supposed to be accompanied by more leisure time. After all, if robots are out there plowing the fields, shouldn’t the farmers get to lie back and enjoy some iced tea?

Something is standing in the way of our claiming the prosperity we have created. The toll collector whose job is replaced by an RFID “E-ZPass” doesn’t reap the benefit of the new technology. When he can’t find a new job, we blame him for lacking the stamina and drive to retrain himself. But even if he could, digital solutions require, on average, less than one tenth the human employees of their mechanical-age predecessors. And what new skill should he go learn? Even the experts and educators have little idea what gainful employment will look like just five years from now.*

In fact, jobs are a relatively new approach to work, historically speaking. Hourly-wage employment didn’t really appear until the late Middle Ages, with the rise of the chartered corporation.[48] Craftspeople were no longer allowed to make and sell goods; they had to work for these protocorporations instead. So people who once worked for themselves now had to travel to the cities and find what became known as “jobs.” They no longer sold what they made; they sold their time—a form of indentured servitude previously known only to slaves. The invention of the mechanical clock coincided with this new understanding of labor as time and made the buying and selling of human hours standard and verifiable.

The time-is-money ethic became so embedded in our culture that putting in one’s hours now feels like an essential part of life. What do you do? Yet jobs were not invented to give us stable identities. They were simply a part of the growth scheme: a way to monopolize the creative innovation and hard labor of the earlier free marketplace. Now that the labor is no longer needed—or is so easily accomplished by machines—must we still keep the jobs?

Not to work feels unethical. Even our society’s favorite billionaires are “self-made,” which, in a reversal of aristocratic values, lends an air of respectability to their wealth that passive inheritors now lack. But if we can separate the notion of employment from that of making a valuable contribution to society, a whole lot of new possibilities open up for us.

Our industrial capabilities have surpassed our requirements. We make more stuff than we can use, at least here in the developed world. Even middle-class Americans rent storage units for their extra stuff. Our banks are tearing down foreclosed homes in multiple U.S. states in order to prevent market values from declining.[49] Our Department of Agriculture is storing, even burning, surplus crops to stabilize prices for industrial agriculture.* There is more than enough to go around.[50] Why don’t we give those houses to the homeless, or that food to the hungry?

Because they don’t have jobs. Letting them just have stuff does not contribute to the great growth imperative.

Instead, we’re supposed to think of new, useless things for these folks to make, then market those things to the rest of us, so that we go buy them, dispose of them, and then create more landfill. All in the name of growth. It’s as if we expect consumers to fuel the production of unnecessary goods just so that people can put in more hours of work they’d rather not be doing. We’re not looking to create jobs because we need more things. We employ people because otherwise we have no way to justify letting them share in a bounty created without their labor.

To most of us, this is just “the way things are,” and to question the arrangement goes against centuries of precedent. Fortunately, all the reasons against overturning the scheme are based solely on the growth requirements of the industrial economic operating system—not on reality. Alternatives to the dehumanization scheme and its impact on work in the twenty-first century and beyond require challenging the underlying assumptions of this system and drawing more-direct lines between what people need and what they can provide. Here are a few possibilities, presented less as fully fleshed-out policies ready to be implemented in one nation or another than as examples of the kinds of thinking we need to be able to do and some of the sacred truths we must be willing to reevaluate. Underlying them all is the implicit suggestion that our biggest challenge may be learning how to say “enough.”

1. Work Less: Reduce the 40-Hour Workweek

We generally start any conversation about employment with the holy 40-hour workweek and work back from there, retrofitting the rest of our business and economic metrics to this fixed value. It’s time we accept the truth: we have gotten so efficient at production that we don’t really need everyone employed 40 hours a week anymore. We have to remap our time and labor in a way that’s appropriate for a postindustrial society. This does not have to happen all at once, but we do have to develop a path toward less work.

Early efforts have been very promising for business and people alike. Juliet Schor, a sociologist at Boston College, believes we must overcome our fear of appearing fanciful or naïve and get on with the business of reducing work hours.[51] Her research shows that more working hours do not lead to a better economy, a better environment, or a better quality of life. Countries that have just begun instituting worktime reduction already have smaller carbon footprints than those that haven’t. Schor has also shown how spending fewer hours on the job frees people to pursue the sorts of things they already do for free and that ultimately contribute even more greatly to the economy—from caring for the sick to teaching children. In the words of New Economics Foundation researcher Julia Slay, “What would the cost to your business be if your workers were never potty trained?”[52] Such value is treated as subservient to the money economy, when it is simply labor unrecorded or, as Lanier would put it, off the books.

Shortening the workweek has a profound effect on many interdependent systems. People have time to do things more slowly, such as walking to work, which uses less carbon. Shortening the workweek gives more people the opportunity to share available work, a form of engagement and participation that improves mental health and creates social bonds.[53] It also reduces overtime and work overload, both of which are statistically linked to mental illness and cancer. Other studies show that working fewer days promotes more civic and community engagement. People’s perception of themselves as “citizens” and their time commitment to social issues both increase.[54]

Even in the United States, recent experiments in shortening the workweek have panned out better than expected. In 2008, Utah instituted a four-day working week for public employees by offering them the opportunity to shift from five 8-hour days to four 10-hour days. Fifty percent of the 18,000 people who participated reported that they were more productive, while a full 80 percent asked to maintain the new schedule after the experiment was over, citing benefits to their relationships, families, and general well-being. The reduction in overtime payments and absenteeism saved the state $4 million and reduced carbon emissions by 400,000 metric tons that year. And this was with no reduction in actual hours.[55] In California, Amador County workers initially protested when their worktime was reduced 20 percent, from five days to four, in order to justify a 10 percent reduction in their pay. Two years later, when they were offered the option of going back to a 40-hour workweek, 79 percent voted to stay at the reduced hours and pay.[56]

Just how strange would it be for successfully automating businesses to phase out work or at least wind down the hours? How about doing it without reducing employee participation in the profits? Not surprisingly, digital companies are some of the first to experiment with shorter weeks that don’t punish employees. Treehouse, an online education startup, adopted a four-day workweek and grows by an average of 120 percent a year.[57] Productivity platform Basecamp has also instituted a four-day week because, as CEO Jason Fried explains, “when there’s less time to do work, you waste less time. . . . You tend to focus on what’s important.”[58] The Basecamp platform has become an industry standard in the startup community, so maybe its approach to enterprise will spread as well.

2. Rewrite the Employee-Company Contract: Share Productivity Gains

What’s most important, even more important than the increased worker efficiency enjoyed by companies with shorter weeks, is the improvement in the health, well-being, and satisfaction of the human beings these companies were built to serve. While passive investors should enjoy the benefits of increasing productivity, so, too, should those who invested sweat equity. Most companies still use increased productivity as an excuse to cut jobs and then pay the savings back to the shareholders as dividends or stock buybacks. It’s Corporatism 101, but ultimately a flawed, short-term approach—especially when productivity gains are spread across so many industries at once. Companies are amputating their human resources while also spoiling their own and everyone else’s customer base by taking away their jobs. And all the while, digital productivity gets blamed for the obsolete business model it’s accelerating.

Firms willing to consider changing previously unmovable pieces of the puzzle, such as work hours, also gain a competitive advantage in attracting and retaining the best talent. (You want to work Tuesdays, Wednesdays, and alternate Thursdays? No problem!) Moreover, keeping a reserve of available hours positions a company to take advantage of sudden bursts in activity, or a rush of new contracts, without having to hire and train new employees (only to fire them a few months later). In digital parlance, this means the company is more “resilient.” It is a less brittle strategy in that it distributes the available work hours to many people instead of overemploying some and unemploying everyone else.

If, thanks to a new technology, workers become much more productive, a company doesn’t have to fire a bunch of them and pass all the savings up to the shareholders. It can instead share the spoils with those workers or—if accelerated productivity outpaces demand—pay them the same salary to work fewer days.

A reduction in workdays is just one of many possible ways to contend with a paucity of available jobs. LinkedIn founder Reid Hoffman envisions digital technologies (like his networking platform) enabling people to abandon the end-to-end employment solutions of yesteryear and adopt a more temporary, improvisational approach to their careers.[59] Instead of seeking a job and then giving years to an employer in return for money, professionals will engage with companies for a specific purpose—more like a campaign. These “alliances” will last an average of eighteen months, during which a new product or division might be launched, a financial problem rectified, or a creative challenge solved. The project itself becomes part of the worker’s portfolio, and the worker is engaged less as an employee than as a partner in the project.

The devil is always in the details: Isn’t this a recipe for exploitation? When everyone’s essentially a work for hire, what happens to the collective bargaining power once offered by labor unions? Would COBRA cover people’s health insurance between engagements that might be years apart? What about pensions? Again, imagination and flexibility are required. New forms of organized labor—like the Freelancers Union—will emerge, and older, preindustrial ones like guilds will likely be retrieved. These sorts of changes don’t happen overnight but incrementally and after much trial and error.
