Author: David K. Shipler
My behavior changed when I lived in Moscow. According to my ethics as a correspondent, I was doing nothing “wrong” by going to see political dissidents or taking them books that were banned by Soviet authorities. But to protect them, Debby and I kept a Magic Slate handy in our bugged apartment. Whatever we wished to discuss without the KGB hearing—mostly plans to visit one or another dissident—we wrote with a stylus, then erased by lifting the slate’s plastic sheet. Or we used hand signals to make the first letters of our dissident friends’ names. We were circumspect on the phone, received our mail through the U.S. Embassy’s diplomatic pouch (a privilege granted to journalists even though we weren’t diplomats), and spoke most freely while outside in Moscow’s expansive parks. We did not quite realize how thoroughly the anxiety over surveillance had filtered into us during our four years there until we left and it began to drain away. It took a year or more before I lost all hesitation about speaking openly at home—as long as it took Brandon Mayfield, the Portland lawyer whose house was secretly searched and bugged by the FBI.
Technology was relatively primitive during my Moscow days in the late 1970s. The KGB was rumored to be bombarding the embassy with microwaves to eavesdrop on conversations by picking up vibrations from window glass, and it could reportedly monitor electric typewriters by remotely detecting which keys were pressed. Compared to today, however, those were the Dark Ages.
Agents followed people around (often clumsily), penetrated campuses and workplaces with informers, tapped phones, opened mail, and drew conclusions about citizens from fragmented reports by party loyalists. In this low-tech police state, the information was “stored on millions of yellowing pieces of paper, typed or handwritten,” The Economist observed in 2007. “These days, data about people’s whereabouts, purchases, behavior, and personal lives are gathered, stored, and shared on a scale that no dictator of the old school ever thought possible.” The magazine published a photograph that seemed positively nostalgic: an East German clerk standing in a narrow aisle between towering shelves, filing stacks of folders bulging with dossiers—on paper.[32]
There is no telling whether the paper files were more or less accurate than their electronic descendants, but today’s caricatures are more easily disseminated, and some are plainly erroneous, fingering the wrong people and missing the right ones. That happens when digital data are collected on a whim without showing grounds for suspicion, and when government abandons the rigorous discipline imposed by the strictures of probable cause, adversary proceedings, and the dispersal of authority, all of which are designed to enhance the truth-finding process. The immense force of the state to prosecute and imprison individuals, confiscate their property, or merely keep them off airplanes is easily misused unless it is checked and balanced.
What has been colloquially called the no-fly list, for example, is designed to flag dangerous passengers for either close searches and questioning or outright denial of permission to board. But the errors that have occurred, the infliction of penalty without due process or appeal, and the dangerous characters it misses have made the list a laughingstock. Trying to fix it, the government has been floundering around for years.
CAPPS, the Computer-Assisted Passenger Profiling System (later changed to “… Prescreening System”), relied on private commercial-data providers. A ticket purchaser’s name, address, phone number, and date of birth would be transmitted to the company to confirm her identity, and the company would send back a score on her risk level: green would get normal security checks, yellow would get closer screening, red would be denied boarding and referred to law enforcement. How these scores were determined was an obscure secret, and CAPPS was so flawed that it was later replaced by a supposedly improved system, CAPPS II.
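For readers who want the mechanics spelled out, the short Python sketch below restates the three-tier decision just described. It is a hypothetical illustration only: the passenger fields, the function names, and the stand-in scoring step are invented here, and nothing is assumed about how the commercial providers actually computed their undisclosed scores.

from dataclasses import dataclass

@dataclass
class TicketPurchaser:
    # The four identity fields the text says were sent to the commercial data provider.
    name: str
    address: str
    phone: str
    date_of_birth: str

def fetch_risk_color(purchaser: TicketPurchaser) -> str:
    # Hypothetical stand-in for the provider's opaque scoring step; the real
    # criteria were never disclosed, so this simply returns a fixed value.
    return "green"

def screening_action(risk_color: str) -> str:
    # Map the returned color to the response described in the text.
    actions = {
        "green": "normal security check",
        "yellow": "closer screening",
        "red": "boarding denied; referred to law enforcement",
    }
    return actions.get(risk_color, "unknown score: manual review")

purchaser = TicketPurchaser("Jane Doe", "1 Main St", "555-0100", "1970-01-01")
print(screening_action(fetch_risk_color(purchaser)))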
When CAPPS II failed to meet congressionally imposed standards, though, it was supplanted in 2004 by a system called Secure Flight, whose similar flaws led to its suspension in 2006. By 2010 it was being revised and revived. Government operatives seemed less skilled at administering than at naming; they masterfully composed titles that invoked fear or patriotism. Whenever a program provoked protests, you could be sure it would get a new name containing “terrorism,” “prescreening,” “secure,” or another trigger word to soothe or scare you into giving up freedoms to the state as noble guardian of safety.
Luckily, enough Americans still bridled at the privacy invasions that class-action lawsuits were directed against airlines, which—like communications companies—became enablers of government surveillance. JetBlue, it was disclosed, provided personal information on 1.5 million passengers to a defense contractor, Torch Concepts, which was researching data-mining techniques for the army. American Airlines sent private data on 1.2 million customers to four firms competing for a contract with the Transportation Security Administration.
The most potent argument against CAPPS was that it was so often wrong. On five occasions, Senator Edward “Ted” Kennedy was initially rejected for flights because “T. Kennedy” turned up on the list as a pseudonym used by a suspected would-be terrorist. The real Kennedy described this encounter with a ticket agent: “He said, ‘We can’t give it to you. You can’t buy a ticket to go on the airline to Boston.’ I said, ‘Well, why not?’ He said, ‘We can’t tell you.’ ” A supervisor finally intervened and let him board, the senator told a Judiciary Committee hearing.
“Tried to get on a plane back to Washington,” he continued. “ ‘You can’t get on the plane.’ I went up to the desk and said, ‘I’ve been getting on this plane, you know, for forty-two years. Why can’t I get on the plane?’ ”[33]
Again, a supervisor recognized the veteran legislator and overruled the computerized list, which had been alerting authorities to many other sinister figures. They included Representative Jim Davis, a Florida Democrat; two San Francisco peace activists, Jan Adams and Rebecca Gordon; and numerous Americans named David Nelson, who were put through the wringer whenever they checked in. The real David Nelson on the list was reportedly a white Irishman, but the database flagged a black David Nelson, a Milwaukee Brewers coach, who had to spend at least forty-five minutes before every flight under questioning by local police and FBI agents. A business consultant named David Nelson was pulled off a plane to be interrogated on the ramp. Another David Nelson, a graduate student, got a quizzical look from every ticket agent whose screen flashed some secret message that she could not reveal to him. She invariably made a flurry of phone calls, then the police appeared, intrusive questions were posed, and Nelson may or may not have made his flight. The actor David Nelson from Ozzie and Harriet was stopped a few times, as was a nine-year-old David Nelson from Alaska. Feel safer?
An ACLU lawsuit on behalf of Adams and Gordon dislodged about three hundred pages of government documents, including the admission that names were placed on the list using “necessarily subjective” criteria and “not hard and fast rules.” The number of people on the list rose from just 16 on September 11, 2001, to more than 400 the next day, 594 by that December, about 1,000 a year later and, the ACLU estimated, tens of thousands by 2006. “In November of 2005, the TSA”—the Transportation Security Administration—“indicated that 30,000 people in the last year alone had contacted the agency because their names had been mistakenly matched to a name on the federal government’s watch lists,” the ACLU reported.[34]
In addition, a database maintained by the Terrorist Screening Center, which alerts local police, border patrol agents, and U.S. consulates when somebody’s name matches one on watch lists, went from about 170,000 individuals in July 2004 to 300,000 by April 2007.[35]
By the time Umar Farouk Abdulmutallab tried to blow up that Northwest flight to Detroit on Christmas Day 2009, the list contained an unwieldy 550,000 names, with subgroups of 14,000 listed for secondary screening and 4,000 truly designated as “no-fly.”
After the bombing attempt, concern naturally focused on oversights in placing potentially dangerous people on the lists, and top intelligence officials expressed regret over giving in to “pressure” by imposing curbs on adding names.[36]
The comments seemed to forecast a rapid growth in the no-fly list, but it hadn’t grown quite enough by the following May, when a Pakistani-American named Faisal Shahzad nearly succeeded in flying from Kennedy Airport after parking an inexpertly assembled car bomb that fizzled in Times Square. His name was added to the list only a few hours before Emirates airline personnel, doing their screenings from an earlier version, ignored an electronic notification to check for an update and allowed him onto Flight 202, bound for Dubai. At the last minute, officials spotted him on the passenger manifest that airlines send routinely to Customs and Border Protection just before departure. The plane was still at the gate. Agents boarded and arrested him.
A more widespread problem has been the inclusion of innocents. At least nine U.S. citizens, three of them military veterans, were stranded overseas in 2010 after being placed on the no-fly list while traveling or studying abroad—in Colombia, the United Kingdom, the Virgin Islands, Germany, Egypt, Yemen, and Saudi Arabia. They could not enter the United States by air. One of them, Stephen Durga Persaud, eventually made it by ship from the Virgin Islands to Miami, according to their complaint filed in federal court, then by train to Los Angeles in time to join his wife there for the birth of their second child. Another, Steven William Washburn, spent fifty hours flying from Dublin to Mexico City via Germany, Brazil, and Peru, then hours more being interrogated by Mexican officials before finally crossing by land to the United States, the country of his birth. But others, more distant or without extensive funds, remained stuck abroad until months after they sued the Obama administration.
The government then agreed to a one-time waiver for each of them, said Ben Wizner of the ACLU, which handled their case. Facing a court challenge, officials seemed willing to defend the no-fly list but not the exclusion of Americans from their homeland. They would be allowed to fly back. But they would not be told if they remained on a watch list that could bar them from leaving, entering, or traveling inside the United States by air. While the litigation continued, those who had not managed to get home by land or sea took the deal and returned, except for two students who chose to avoid the risk. They were halfway through a two-year Arabic course at a Saudi university and could not get the U.S. government to tell them whether they would be permitted to fly back to Saudi Arabia to complete their studies. What had led to listing them and the others—whether error or superficial profiling or unverified intelligence—the government would not reveal, but at least into late 2010, none of the nine had been charged with a crime. They were punished nonetheless, for the ban on flying shackled their lives, and it came without accusation, proof, or the semblance of due process. The men had no idea how to respond to invisible allegations embedded in government databases.
Zealous to deny terrorists enough information to foil this “system,” the federal government denies citizens the basic information to judge its effectiveness or repair its shortcomings. With no scrutiny of how someone gets onto a list, what information contributes to the labeling of people as nefarious characters, and what procedure can be used to correct the record, there are no brakes to impede innocent error, political targeting, and the disruption of lives based on ethnic or religious identity. Without a chance to see the collected information and what agents conclude from it, neither the Congress nor the courts nor the public can tell whether innocents are being pursued, whether the precious resources of law enforcement are chasing around fruitlessly. After investigating itself, the Department of Homeland Security found its procedures to remove names listed incorrectly to be wholly inadequate.
A healthy correction by the judicial branch stopped the executive branch in 2007 from using a severely flawed database to ferret out illegal immigrants. The plan required employers to submit Social Security numbers provided by their employees for verification against a government master file. It seemed straightforward enough. The government would issue “no-match letters” when workers provided phony numbers, and employers would be required to fire them within ninety days or face prosecution. But the database was a mess. Sampling showed that false alerts would have been generated on many legal residents and U.S. citizens, for the 435 million individual records held by the Social Security Administration contained 17.8 million with errors, according to the agency’s own inspector general, including 12.7 million for native-born Americans. The Bush administration’s program, which would have sacrificed the well-being of law-abiding citizens as collateral damage to detect undocumented aliens, was halted by a federal judge. President Obama’s administration dropped the no-match rule after he took office.[37]
In response to concerns about innocents being wrongly targeted, one might argue that it’s the partial picture that leads to mistakes and that more intrusive surveillance would provide more complete profiles and reduce the error rate. Technology is bound to improve, mining data more reliably, connecting dots more accurately. It’s an alluring prospect—that the pieces of ordinary life can be “aggregated,” as intelligence types say, into revealing mosaics. But having accurate raw data solves only the first of three basic problems that concern Jim Dempsey of the Center for Democracy and Technology. There can also be false inference. “You can have perfect data,” he says, “but perfect data plus false inference equals bad outcome.” Finally, there can be intentional abuse and political monitoring: “He opposes the war in Iraq, he must be a security threat.” This has certainly been the mentality of law enforcement when monitoring peace groups and policing demonstrations.