
The problem, as scholars have pointed out, is not that the justices are trying to become more informed. It's the nature of the information they turn up.
In many cases, the “facts” they discover are flawed or misleading.

Judges, just like the rest of us, tend to make gut decisions and then look for supporting data, discarding and dismissing conflicting evidence along the way. It's the same problem we encountered with the police, emergency responders, and medical personnel who focused on the facts that fit the initial conclusion that David Rosenbaum was just a drunk.
When judges do research, they already have an idea of what they are looking for and—surprise!—they tend to find it.
The underlying drive is to bolster an argument, not discover the truth.

Suppose you are a justice looking, for the first time, at the details of the Sykes case. Your immediate instinct (no doubt informed by watching the video of the police chase and fiery crash in Scott v. Harris four years earlier) is that of course fleeing the police in a vehicle is a violent felony. But you need a reason to justify that position, and so your brain offers a rationalization: lots of people are injured and killed during pursuits. It seems like that has to be true; you just need to find the data that says so. And so you go online to look for sources and you search until you find just such a study. There: proof that people are injured and killed when police chases occur, which in turn establishes that vehicular flight crimes are indeed violent felonies. It felt like the conclusion was dictated by the facts, when really a gut reaction led you to engage in a narrow, targeted search.

We tend to assume that the more data a person has at her fingertips, the more accurate she'll be.
But, in fact, having more information may make it easier to find the necessary support for an erroneous proposition.
When a pair of political scientists asked a group of Republicans whether “the size of the yearly budget deficit increased, decreased, or stayed about the same during Clinton's time as President”—it, in fact, decreased—the most politically informed members (those in the 95th percentile) gave a wrong answer more often than less informed members (those in the 50th percentile).
A similar effect was found for well-informed Democrats asked about the state of inflation during President Reagan's time in office.
With more information on hand to support the intuition that the other side's president had been a failure, it was easier to reach the wrong (but favored) conclusion.

Our analytical skills can be distorted by a similar dynamic: sometimes being more adept at evaluating something can actually amplify our ideological biases.
In one set of experiments, researchers looked at how people with different levels of math competency assessed the effectiveness of a skin-rash treatment or a firearm regulation when given basic data.
On the skin-rash evaluation, things played out exactly as you might expect: those who were bad at math got the right answer about half as often as those who were good at math.
But when participants were asked to determine the effectiveness of the gun ban, something funny happened with the highly numerate.
When the data pointed to a conclusion that conflicted with their ideology, they appeared to disregard it.
Given numbers suggesting that crime decreased, mathematically adept conservatives got the right answer only about 20 percent of the time, as compared with 85 percent of the time when the data suggested that crime increased.
The reverse was true for liberals: about 70 percent reached the correct conclusion when the data pointed to a decrease in crime, but that dropped to below half when the data implied that the ban was ineffective.
Despite knowing how to use the numbers to make an accurate determination, those with conflicting ideological positions simply went with their gut. The findings suggest that being a more skilled and experienced member of the bench might not bring the benefits we'd expect.

The situation may be exacerbated by the fact that when it comes to the controversial issues that come before the Supreme Court, there are almost always authorities to buttress any position one might want to take.
Indeed, when Justice Elena Kagan sought support for her dissent in Sykes—that fleeing the cops in a car is not inherently violent and aggressive—she was easily able to find it, citing evidence that a driver might legitimately fear that a criminal rather than a police officer was pulling her over. The fact that justices' research may be driven more by motivated reasoning than by an open-minded quest for information is reflected in the diversity of sources that justices cite.
Look at recent opinions and you'll see interest-group sites and blogs alongside esteemed peer-reviewed journals.

It doesn't help that judges are exposed to a surprisingly narrow set of ideas, experiences, and viewpoints in their daily interactions. Sure, judges are not sequestered in ivory towers.
They have spouses, children, and friends; they attend cookouts and weddings and plays; they read books, watch movies, and go on vacations. But, like all of us, they fall into routines, sticking to what they already know, prefer, and trust.

Justice Scalia reads two newspapers in the morning: the Wall Street Journal and the Washington Times. As he told a journalist for New York magazine, he “used to get the Washington Post, but it just…went too far for me. I couldn't handle it anymore.”
He was tipped over the edge by “the treatment of almost any conservative issue. It was slanted and often nasty. And, you know, why should I get upset every morning?”
He “usually” listens to talk radio—that is where he gets most of his news.
In the past, he went to dinner parties that had a real mix of liberals and conservatives, he said, but that hasn't happened in a long time.

We all wear blinders fashioned from our limited lives. And if you happen to live in northern Virginia, listen to NPR, and mingle primarily with liberals, you are going to conduct your judicial research accordingly, clicking on certain websites and not others, recalling particular research studies, reading beyond the abstract of this author's paper but not his colleague's.
And you may surround yourself with clerks who do the same.

A search engine like Google may seem to offer a way out of the bind.
But search engines are themselves deeply biasing.
Many of them create filter bubbles by organizing the results based on your particular interests and proclivities, as revealed by the other websites you visited, your Facebook profile, or other personal details.
In essence, without your awareness, you are being steered toward the sources that you are likely to find the most persuasive and that are most likely to support your views—and away from those that might cause you to rethink your positions.

Amicus curiae briefs—“friend-of-the-court” filings, widely believed to aid the justices by helping to fill informational gaps—are a dead end as well.
Although they often purport to offer impartial counsel, they are advocacy documents with facts chosen to persuade.
And members of the Court draw from them—more than a hundred times between 2008 and 2013—with a startling lack of scrutiny, citing amicus facts backed up by e-mails, research funded by the amicus itself, unpublished studies “on file with the authors,” and, sometimes, nothing at all.
With dozens of amici in certain cases and more each year, it seems as if justices are being given a deep reservoir of knowledge, but all that the system really does is supply an easier way to support preexisting conclusions.

—

As we have seen, much of the bias that infects a judge's decisionmaking is subtle and automatic. And in many cases it is small enough or disguised enough to go unnoticed by others.

It is like having a single step that's ever so slightly higher than the others.
Until recently, at the 36th Street subway stop in Brooklyn, there was just such a step.
Every day it caused numerous people to trip as they ascended to street level.
But no one did anything, even those who were most severely affected.
The guy who nearly dropped his baby?
The woman who fell to her knees?
They caught their balance or brushed themselves off and walked on, thinking that it was just bad luck or that they had been clumsy or distracted.
Few, if any, blamed the step, and so there it remained, a fraction of an inch off, until someone decided to film the entrance. Suddenly, with a pool of data, the problem was so clear it was comical.
In under an hour, the videographer captured seventeen people stumbling on the step.
And within a day of the evidence being posted online, New York's Metropolitan Transportation Authority had begun replacing the staircase.

We should embrace a similar approach with respect to our courts. Judges need to know if they are more likely to grant parole in the morning than at the end of the day or show more leniency toward a white petitioner than a black one. They need to be aware of how they conduct research and how often they side with the government and whether female attorneys coming before them fare as well as men.
But if no one is keeping careful track of their decisions, how will they see the patterns?

Fortunately, a lot of the monitoring and aggregating machinery already exists. Journalists and academics are in a better position than ever to uncover unequal treatment and distorted outcomes.
It was the Boston Globe's analysis of more than fifteen hundred Massachusetts drunk-driving cases, for example, that revealed a gross disparity in verdicts.
In 2010, 82 percent of defendants who selected a bench trial before a judge were acquitted (well above the national average); for those who stuck with a jury trial, the figure was just 51 percent.
In interviews, the judges themselves seemed surprised at the results—they simply did not realize how tilted they were against the prosecution. It took an outside monitor pulling together all of the data to reveal the slant.
The journalists' work prompted the Supreme Judicial Court to commission its own year-long study of the problem, which made specific recommendations to help reduce the acquittal rate and restrict the ability of defense attorneys to steer their clients to the most favorable judges, among other things.
These ongoing reform efforts hold the potential not only to increase fairness but also to save lives.

That said, with the decline of print media—the traditional bastion of serious investigative journalism—and constraints on university research funding, the judiciary itself ought to commit to better recordkeeping aimed specifically at uncovering hidden biases.
If the Massachusetts Trial Court had collected data on conviction rates for bench and jury trials, it might have noticed the problem years earlier.

On an individual level, psychological research suggests that judges could also benefit from self-monitoring by learning about the biases that influence their behavior, expecting (and accepting) that they are not immune, and then taking stock of how they actually behave. The judiciary could help judges with this process not only by providing training on relevant psychological dynamics (a few seminars have already been offered at the federal level on implicit bias) but also by providing individualized statistics.
Judges receive surprisingly little feedback on their decisionmaking—lawyers rarely offer it, and the appellate review process rarely yields any meaningful information on cognitive biases or errors. How does a judge know, for example, whether race, gender, or age impacts her treatment of defendants, or whether the harsh sentences she hands down are effective? Judges usually make calls and move on.
But seeing the data could be a powerful antidote.

A judge is always going to have hunches about a case—and when those hunches reflect years of experience, they can be valuable.
But since they can also lead to errors, our intuitions need to be carefully examined.

That's equally true for police officers, lawyers, jurors, and witnesses: we need to get all of our key legal actors in the business of second-guessing themselves. That sounds strange, but doubt isn't the enemy of justice—blind certainty is. And, in most cases, healthy skepticism isn't going to develop on its own because there are so many forces pulling the other way.

Former New York Supreme Court judge Frank Barbaro stands as a prime example of how to interrogate one's instincts and decisions.
Everything is stacked against it, but Barbaro found a way: “I had a practice…that whenever I made a legal decision, I never let it lie. I ran it through my mind again.” There was one case in particular that he kept mulling over.
In October 1999, the defendant, Donald Kagan, had waived his right to a jury trial, putting his fate entirely in Barbaro's hands.
Kagan claimed that he had acted in self-defense when he shot and killed Wavell Wint outside a Brooklyn movie theater.
But Judge Barbaro convicted him of second-degree murder and a weapons charge and sentenced him to fifteen years to life.

Although it had been more than a decade, Barbaro told his wife, Patti, “I really feel I need to revisit this case. I need to get the transcripts. I don't feel comfortable with this. It's been haunting me.”
And when he pored over the record anew, he was “absolutely horrified”: “It was so obvious I had made a mistake. I got sick. Physically sick.”

He realized that his own background and experience as a vigorous civil rights advocate had colored his treatment of the case: “When the trial began, I was absolutely convinced that Donald Kagan [who was white] was a racist and was out looking for trouble and fully intended to kill Mr. Wint [who was black].”
That frame had caused him to overlook evidence that suggested that Kagan acted in self-defense.

When Barbaro revisited the facts, Wint appeared to have been the aggressor.
It seemed Kagan had shown his gun only to ward off a drunken Wint, who'd tried to rob Kagan of his gold chain.
Wint's friends had dragged him away, but he'd fought them off and gone right back into Kagan's face.
When Kagan pulled his gun a second time, Wint went for it.
In the scuffle, the gun fired into Wint's torso.
In December 2013, fourteen years after Kagan had been convicted, Judge Barbaro took the witness stand to claim that his own verdict should be overturned: “I believe now that I was seeing this young white fellow as a bigot, as someone who assassinated an African American….I was prejudiced during the trial.”
