True Names and the Opening of the Cyberspace Frontier

In many ways this is similar to safety engineering. Safety is another engineering requirement that isn't simply a “feature.” But safety engineering involves making sure things do not fail in the presence of random faults: it's about programming Murphy's computer, if you will. Security engineering involves making sure things do not fail in the presence of an intelligent and malicious adversary who forces faults at precisely the worst time and in precisely the worst way. Security engineering involves programming Satan's computer.

And Satan's computer is hard to test.

Virtually all software is developed using a “try-and-fix” methodology. Small pieces are implemented, tested, fixed, and tested again. Several of these small pieces are combined into a module, and this module is then tested, fixed, and tested again. Small modules are then combined into larger modules, and so on. The end result is software that more or less functions as expected, although in complex systems bugs always slip through.

This try-and-fix methodology just doesn't work for testing security. No amount of functional testing can ever uncover a security flaw, so the testing process won't catch security vulnerabilities. Remember that security has nothing to do with functionality. If you have an encrypted phone, you can test it. You can make and receive calls. You can try, and fail, to eavesdrop. But you have no idea whether the phone is secure or not.
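To make the point concrete, here is a small hypothetical sketch (mine, not the essay's): two password checks that behave identically under functional testing, even though one of them leaks timing information an attacker could exploit. The function names and test values are invented for illustration.

```python
import hmac

def check_password_naive(stored: str, attempt: str) -> bool:
    # Functionally correct: returns True iff the passwords match.
    # But '==' can return as soon as it finds a mismatched character,
    # leaking timing information about how much of the guess was right.
    return stored == attempt

def check_password_constant_time(stored: str, attempt: str) -> bool:
    # Same functional behavior, but the comparison takes the same time
    # regardless of where the strings differ.
    return hmac.compare_digest(stored.encode(), attempt.encode())

# A functional test suite passes both versions and never sees the difference.
for check in (check_password_naive, check_password_constant_time):
    assert check("s3cret", "s3cret") is True
    assert check("s3cret", "guess") is False
```

Every assertion above passes for both functions; only a security review of the comparison itself would flag the first one.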

The only reasonable way to “test” security is to perform security reviews. This is an expensive, time-consuming, manual process. It's not enough to look at the security protocols and the encryption algorithms. A review must cover specification, design, implementation, source code, operations, and so forth. And just as functional testing cannot prove the absence of bugs, a security review cannot show that the product is in fact secure.

It gets worse. A security review of version 1.0 says little about the security of version 1.1. A security review of a software product in isolation does not necessarily apply to the same product in an operational environment. And the more complex the system is, the harder a security evaluation becomes and the more security bugs there will be.

Suppose a software product is developed without any functional testing at all. No alpha or beta testing. Write the code, compile it, and ship. The odds of this program working at all -- let alone being bug-free -- are zero. As the complexity of the product increases, so will the number of bugs. Everyone knows testing is essential.

Unfortunately, this is the current state of practice in security. Products are being shipped without any, or with minimal, security testing. I am not surprised that security bugs show up again and again. I can't believe anyone expects otherwise.

Even worse, products are getting more complex every year: larger operating systems, more features, more interactions between different programs on the Internet. Windows NT has been around for a few years, and security bugs are still being discovered. Expect many times more bugs in Windows 2000; the code is significantly larger. Expect the same thing to hold true for every other piece of software.

This won't change. Computer usage, the Internet, convergence, are all happening at an ever-increasing pace. Systems are getting more complex, and necessarily more insecure, faster than we can fix them -- and faster than we can learn how to fix them.

Acknowledgements: The phrase “programming Satan's computer” was originally Ross Anderson's. It's just too good not to use, though.

How Is the NII Like a Prison?

Alan Wexelblat

When using the Internet we often forget that we're not alone. People chat online and enter “rooms” where they can be with others, but all the while there are aspects of the Internet that we for the most part ignore. Alan Wexelblat explains in cogent terms how the panoptic sort (which he kindly defines in his essay) can turn the Internet into a tool used for functions completely different from those private citizens would like to see.

Whether the Internet is a prison or not is debatable, but the desire of large businesses to exploit the Internet (known in U.S. policy circles as the National Information Infrastructure) is undeniable. If you doubt business's ability or intent to exploit every possible advantage, I suggest you take a quick reality check … and—no pun intended—not at your local bank.

Alan Wexelblat, now PhD, was a researcher at the MIT Media Lab's Software Agents Group. He has returned to the commercial world, working for a small software company. This article was written in the mid-1990s.

The National Information Infrastructure is evolving on our screens. But behind the scenes another infrastructure is growing, one that threatens to turn the NII not into an information superhighway but into an information prison. Everyone has a different vision for the NII, from five hundred channels of consumer heaven to networked egalitarian communities. There are nearly as many models for the NII as there are writers interested in the topic.

Regardless of which model holds, however, it seems clear that the NII will be a primary mechanism for the transaction of business between companies and customers and between government and citizens. A recent book, The Panoptic Sort: A Political Economy of Personal Information, by Oscar Gandy, attempts to paint a picture of an emerging phenomenon that affects how these transactions will be carried out.

This mechanism, which he calls the panoptic sort, describes an information collection and use regime that severely impinges on the privacy of, and the opportunities afforded to, people in our late capitalist culture. The panoptic sort is a set of practices by government and especially by companies whereby information is gathered from people through their transactions with the commercial system. The information is then exchanged, collated, sold, compared, and subjected to extensive statistical analyses.

As Gandy describes it:
The panoptic sort is the name I have assigned to the complex technology that involves the collection, processing, and sharing of information about individuals and groups that is generated through their daily lives as citizens, employees, and consumers and is used to coordinate and control their access to the goods and services that define life in the modern capitalist economy. The panoptic sort is a system of disciplinary surveillance that is widespread but continues to expand its reach.

The goal of these activities is to enable information-holders to make predictions about the behavior of the people on whom the information was collected. The ultimate goal is to be able to sort all the people the company comes in contact with along whatever dimension of information is desired:

• How likely is this person to pay his charge bill?

• How likely is this person to become pregnant at some point in her work career?

• Does this family qualify for food stamps?

The essential element of the panoptic sort is the transaction. People, for the purpose of the sort, exist only in discrete interactions, when some exchange is made for goods or services. The prototypical transaction is the application, in which the person provides detailed information in exchange for potential access (to a job, to medical care, etc.). People are usually not permitted to withhold information from a transaction. For example, credit card applications (even so-called preapproved ones) will not be processed unless the applicant provides a Social Security number (SSN). Similarly, the government now requires all children above the age of two to have an SSN if their names appear on any bank accounts or tangible assets.

In order to make discriminations such as the ones above, the decision makers need complete information. Thus, the term panoptic, or all-seeing. Gandy draws the term from its earlier use by Jeremy Bentham, an English philosopher and prison reformer of the late eighteenth and early nineteenth centuries. Bentham proposed constructing prisons in the form of something he called a Panopticon. In this model, prisoners would be held in cells with glass doors arranged around a ring. At the center of the ring would be the guard tower. Important to Bentham's design was that the prisoners were isolated from each other and could not see each other, nor could they see the guards. The guards in the tower, however, could see all the prisoners without the inmates knowing they were being watched.

Gandy points out that the panoptic sort operates by essentially the same principles: our lives as consumers are opened up to scrutiny by arbitrary persons at any time for undisclosed purposes. We are atomized—treated as individual consumer-units unable to act collectively. At the same time we are prevented from knowing about the companies that observe us.

The panoptic sort also serves to extend control over unprecedented distances. Though the methods and techniques involved today have precedents and roots reaching back to the beginnings of the industrial revolution, the technology in use now and in the near-NII future enables the extension of controls over global distances. Increasingly we find not just our workplaces but our homes invaded. The transit between home and work and our vacations also face intrusion. Part of this chapter was written on an airplane on which the flight steward announced that “your nightmare has come true: now you can be called in-flight.” Presumably we trust that the content of these calls will not be captured and analyzed for others' advantage the way early telegrams were read by Western Union.

There are a number of consequences for people subjected to this sort of pervasive control and observation regime, not least of which is that we self-censor. People trained to expect denial (of services, credit, or opportunity) will soon cease applying for more. Subject to observation at any time by unknown persons with unpredictable means of retribution, we chill our own speech and action in ways antithetical to democracy. This process is already in evidence in America today. Chomsky has repeatedly pointed out that official censorship is not found in America precisely because speech there poses no particular threat to anyone in power.

Means of Operation

The panoptic sort operates by means of a three-step process: identification, classification, and assessment. Identification involves the association of persons, at the time of a transaction, with an existing file of information such as a credit or medical history. The panoptic sort not only requires us to submit increasingly detailed verifications of our identity, it requires the potential involvement of third parties merely to vouch for who/what we are; that is, our credit card companies vouch for us when we write a check, or the Department of Motor Vehicles when we buy a drink. Identification proceeds from a basis of complete distrust.

Identificative distrust has infiltrated our society to such an extent that we are all accustomed to being required to carry identificative tokens. Each of these tokens is the result of a transaction with the panopticon; each is granted to us in acknowledgment of our contribution of information to another file of information. Common “documentary tokens” (as Gandy calls them) include:

• Birth certificate

• Driver's license

• Social security card

This process of identification-via-token continues to expand. In reaction to mounting losses and falsifications associated with common tokens, new proposals are being made. The most successful of these so far is the ATM (automatic teller machine) or debit card. This card requires the user to enter a PIN (Personal Identification Number) and acts as a cash equivalent in many situations, though its online, real-time nature provides excellent data-gathering opportunities. Banks report losses through ATM/debit cards that are twenty to thirty times lower than losses associated with credit cards.

The next step in this process is currently under discussion. The technology involved is the “smart” card, so named because in addition to the ability to record information (on a magnetic strip or onboard computer memory) the card contains processing power to update the stored information and do computation with it in real time. Several proposals have been put forth recently to establish a national identification system around such smart cards.

In these systems, everyone would be required to carry a card that contained potentially vast amounts of personal information about the bearer's health, financial status, physical condition, residence, and so on. In addition, the cards' memory can be used to hold recent transactional information, such as the last n purchases made or the last n banking transactions. The card could also be programmed to do real-time identification of the holder, replacing PINs with some form of biometric analysis, such as voice identification or a fingerprint.

It is worth noting that in every case, the proposal is made in response to a supposed problem: illegal immigration, welfare “cheats,” national driver's licenses, access to personal medical information in an emergency. Invariably, the solution requires that we give up more of our privacy and personal information. Rather than fixing systemic causes, or looking rationally at whether these “cures” are worse than the problems they might solve, the operators of the panoptic sort use the publicity and fear associated with societal ills to expand their reach. The rational observer is left to wonder what information from his national ID card might be made available to whom and what information might be stored on the card without his knowledge.

Classification is “the assignment of individuals to conceptual groups on the basis of identifying information.” Classification is fundamentally about control. Since complete detailed information on everyone is impossible, companies use increasingly small “buckets” or groupings into which people can be classified. The assertion being made is that certain discernible information, such as income, number of children, marital status, and so on, can be used to assign people to a category such as “young, upwardly mobile professional” (the original classification that led to the term “yuppie” entering the public discourse). Once people have been assigned to such groupings, their behavior can be predicted by statistical techniques applied to the group as a whole.
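The bucketing step can be sketched in a few lines of code. This is a purely hypothetical illustration (the attribute names, bucket labels, and statistics below are invented, not Gandy's): a person is reduced to a handful of observable fields, mapped to a group, and then judged by the group's statistics rather than by anything known about the individual.

```python
# Hypothetical panoptic classification: individuals become buckets,
# and predictions about them are just their bucket's averages.

def classify(person: dict) -> str:
    # Crude rules of the kind a marketer might use; entirely invented.
    if person["income"] > 75_000 and person["age"] < 40 and not person["children"]:
        return "young urban professional"
    if person["children"] and person["income"] < 30_000:
        return "low-income family"
    return "general consumer"

# Invented group-level statistics standing in for a data broker's models.
default_rate = {
    "young urban professional": 0.02,
    "low-income family": 0.15,
    "general consumer": 0.07,
}

alice = {"income": 90_000, "age": 31, "children": False}
bucket = classify(alice)
# Alice's "predicted" behavior is simply her bucket's average.
predicted = default_rate[bucket]
```

The point of the sketch is the substitution in the last line: once the assignment is made, the individual's own history no longer matters to the prediction.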

