Structure and Interpretation of Computer Programs

Authors: Harold Abelson and Gerald Jay Sussman with Julie Sussman

Al Moyé arranged for us to teach this material to engineers at
Hewlett-Packard, and for the production of videotapes of these
lectures.
We would like to thank the talented instructors – in
particular Jim Miller, Bill Siebert, and Mike Eisenberg – who have
designed continuing education courses incorporating these tapes and
taught them at universities and industry all over the world.

Many educators in other countries have put in significant
work translating the first edition.
Michel Briand, Pierre Chamard, and André Pic produced a French edition;
Susanne Daniels-Herold produced a German
edition; and Fumio Motoyoshi produced a Japanese edition.
We do not know who produced the Chinese edition,
but we consider it an honor to have been selected as the
subject of an “unauthorized” translation.

It is hard to enumerate all the people who have made technical
contributions to the development of the Scheme systems we use for
instructional purposes. In addition to Guy Steele, principal wizards
have included Chris Hanson, Joe Bowbeer, Jim Miller, Guillermo Rozas,
and Stephen Adams. Others who have put in significant time are
Richard Stallman, Alan Bawden, Kent Pitman, Jon Taft, Neil Mayle, John
Lamping, Gwyn Osnos, Tracy Larrabee, George Carrette, Soma
Chaudhuri, Bill Chiarchiaro, Steven Kirsch, Leigh Klotz, Wayne Noss,
Todd Cass, Patrick O'Donnell, Kevin Theobald, Daniel Weise, Kenneth
Sinclair, Anthony Courtemanche, Henry M. Wu, Andrew Berlin, and Ruth
Shyu.

Beyond the MIT implementation, we would like to thank the many people
who worked on the IEEE Scheme standard, including William Clinger and
Jonathan Rees, who edited the R⁴RS, and Chris Haynes, David
Bartley, Chris Hanson, and Jim Miller, who prepared the IEEE standard.

Dan Friedman has been a long-time leader of the Scheme community.
The community's broader work goes beyond issues of language design to
encompass significant educational innovations, such as the high-school
curriculum based on EdScheme by Schemer's Inc., and the wonderful
books by Mike Eisenberg and by Brian Harvey and Matthew Wright.

We appreciate the work of those who contributed to making this a real
book, especially Terry Ehling, Larry Cohen, and Paul Bethge at the MIT
Press. Ella Mazel found the wonderful cover image. For the second
edition we are particularly grateful to Bernard and Ella Mazel for
help with the book design, and to David Jones, TeX wizard
extraordinaire. We also are indebted to those readers who made
penetrating comments on the new draft: Jacob Katzenelson, Hardy
Mayer, Jim Miller, and especially Brian Harvey, who did unto this book
as Julie did unto his book Simply Scheme.

Finally, we would like to acknowledge the support of the organizations that
have encouraged this work over the years, including support from Hewlett-Packard,
made possible by Ira Goldstein and Joel Birnbaum, and support from DARPA, made
possible by Bob Kahn.

Building Abstractions with Procedures

The acts of the mind, wherein it exerts its power over simple ideas,
are chiefly these three: 1. Combining several simple ideas into one
compound one, and thus all complex ideas are made. 2. The second is
bringing two ideas, whether simple or complex, together, and setting
them by one another so as to take a view of them at once, without
uniting them into one, by which it gets all its ideas of relations.
3. The third is separating them from all other ideas that accompany
them in their real existence: this is called abstraction, and thus all
its general ideas are made.

John Locke, An Essay Concerning Human Understanding (1690)

We are about to study the idea of a computational process.
Computational processes are abstract beings that inhabit computers.
As they evolve, processes manipulate other abstract things called
data. The evolution of a process is directed by a pattern of rules
called a program. People create programs to direct processes.
In effect, we conjure the spirits of the computer with our spells.

A computational process is indeed much like a sorcerer's idea of a
spirit. It cannot be seen or touched. It is not composed of matter
at all. However, it is very real. It can perform intellectual work.
It can answer questions. It can affect the world by disbursing money
at a bank or by controlling a robot arm in a factory. The programs we
use to conjure processes are like a sorcerer's spells. They are
carefully composed from symbolic expressions in arcane and esoteric
programming languages that prescribe the tasks we want our
processes to perform.

A computational process, in a correctly working computer, executes
programs precisely and accurately. Thus, like the sorcerer's
apprentice, novice programmers must learn to understand and to
anticipate the consequences of their conjuring. Even small errors
(usually called bugs or glitches) in programs can have complex and
unanticipated consequences.

Fortunately, learning to program is considerably less dangerous than
learning sorcery, because the spirits we deal with are conveniently
contained in a secure way. Real-world programming, however,
requires care, expertise, and wisdom. A small bug in a computer-aided
design program, for example, can lead to the catastrophic collapse of
an airplane or a dam or the self-destruction of an industrial robot.

Master software engineers have the ability to organize programs so
that they can be reasonably sure that the resulting processes will
perform the tasks intended. They can visualize the behavior of their
systems in advance. They know how to structure programs so that
unanticipated problems do not lead to catastrophic consequences, and
when problems do arise, they can debug their programs. Well-designed
computational systems, like well-designed automobiles or nuclear
reactors, are designed in a modular manner, so that the parts can be
constructed, replaced, and debugged separately.

Programming in Lisp

We need an appropriate language for describing processes, and we will
use for this purpose the programming language Lisp. Just as our
everyday thoughts are usually expressed in our natural language (such
as English, French, or Japanese), and descriptions of quantitative
phenomena are expressed with mathematical notations, our procedural
thoughts will be expressed in Lisp.
Lisp was invented in the late
1950s as a formalism for reasoning about the use of certain kinds of
logical expressions, called recursion equations, as a model for
computation. The language was conceived by
John McCarthy and is based
on his paper “Recursive Functions of Symbolic Expressions and Their
Computation by Machine” (McCarthy 1960).

Despite its inception as a mathematical formalism, Lisp is a practical
programming language. A Lisp interpreter is a machine that
carries out processes described in the Lisp language. The first Lisp
interpreter was implemented by
McCarthy with the help of colleagues
and students in the Artificial Intelligence Group of the
MIT Research
Laboratory of Electronics and in the MIT Computation
Center.[1] Lisp, whose name is an acronym for LISt Processing,
was designed to provide symbol-manipulating capabilities for
attacking programming problems such as the symbolic differentiation
and integration of algebraic expressions. It included for this
purpose new data objects known as atoms and lists, which most
strikingly set it apart from all other languages of the period.

Lisp was not the product of a concerted design effort. Instead, it
evolved informally in an experimental manner in response to users'
needs and to pragmatic implementation considerations. Lisp's informal
evolution has continued through the years, and the community of Lisp
users has traditionally resisted attempts to promulgate any
“official” definition of the language. This evolution, together
with the flexibility and elegance of the initial conception, has
enabled Lisp, which is the second oldest language in widespread use
today (only
Fortran is older), to continually adapt to encompass the
most modern ideas about program design. Thus, Lisp is by now a family
of dialects, which, while sharing most of the original features, may
differ from one another in significant ways. The dialect of Lisp used
in this book is called Scheme.[2]

Because of its experimental character and its emphasis on symbol
manipulation,
Lisp was at first very inefficient for numerical
computations, at least in comparison with Fortran. Over the years,
however, Lisp compilers have been developed that translate programs
into machine code that can perform numerical computations reasonably
efficiently. And for special applications, Lisp has been used with
great effectiveness.[3]
Although Lisp has not yet overcome its old reputation
as hopelessly inefficient, Lisp is now used in many applications where
efficiency is not the central concern. For example, Lisp has become
a language of choice for operating-system shell languages and for
extension languages for editors and computer-aided design systems.

If Lisp is not a mainstream language, why are we using it as the
framework for our discussion of programming? Because the language
possesses
unique features that make it an excellent medium for
studying important programming constructs and data structures and for
relating them to the linguistic features that support them. The most
significant of these features is the fact that Lisp descriptions of
processes, called procedures, can themselves be represented and
manipulated as Lisp data. The
importance of this is that there are powerful program-design
techniques that rely on the ability to blur the traditional
distinction between “passive” data and “active” processes. As we
shall discover, Lisp's flexibility in handling procedures as data
makes it one of the most convenient languages in existence for
exploring these techniques. The ability to represent procedures as
data also makes Lisp an excellent language for writing programs that
must manipulate other programs as data, such as the interpreters and
compilers that support computer languages. Above and beyond these
considerations, programming in Lisp is great fun.
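
To give a flavor of what this means, here is a minimal sketch (the
names twice and add-three are made up for illustration, and the
notation is introduced later in this chapter): a procedure that accepts
another procedure as an argument and returns a newly constructed
procedure as its value, handling both as ordinary data.

(define (twice f)                  ; f is a procedure, received as data
  (lambda (x) (f (f x))))          ; the value returned is itself a new procedure

(define (add-three x) (+ x 3))

((twice add-three) 10)             ; applies add-three two times, yielding 16

The procedure produced by twice can be passed around, named, and
combined just as numbers or lists can, which is what blurs the line
between “passive” data and “active” process.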

[1] The Lisp 1 Programmer's Manual appeared in 1960, and the Lisp 1.5
Programmer's Manual (McCarthy 1965) was published in 1962. The early
history of Lisp is described in McCarthy 1978.

[2] The two dialects in which most major Lisp programs of the 1970s
were written are MacLisp (Moon 1978; Pitman 1983), developed at the
MIT Project MAC, and Interlisp (Teitelman 1974), developed at Bolt
Beranek and Newman Inc. and the Xerox Palo Alto Research Center.
Portable Standard Lisp (Hearn 1969; Griss 1981) was a Lisp dialect
designed to be easily portable between different machines. MacLisp
spawned a number of subdialects, such as Franz Lisp, which was
developed at the University of California at Berkeley, and Zetalisp
(Moon 1981), which was based on a special-purpose processor designed
at the MIT Artificial Intelligence Laboratory to run Lisp very
efficiently. The Lisp dialect used in this book, called Scheme
(Steele 1975), was invented in 1975 by Guy Lewis Steele Jr. and
Gerald Jay Sussman of the MIT Artificial Intelligence Laboratory and
later reimplemented for instructional use at MIT. Scheme became an
IEEE standard in 1990 (IEEE 1990). The Common Lisp dialect (Steele
1982, Steele 1990) was developed by the Lisp community to combine
features from the earlier Lisp dialects to make an industrial
standard for Lisp. Common Lisp became an ANSI standard in 1994
(ANSI 1994).

[3] One such special application was a breakthrough computation of
scientific importance – an integration of the motion of the Solar
System that extended previous results by nearly two orders of
magnitude, and demonstrated that the dynamics of the Solar System is
chaotic. This computation was made possible by new integration
algorithms, a special-purpose compiler, and a special-purpose
computer all implemented with the aid of software tools written in
Lisp (Abelson et al. 1992; Sussman and Wisdom 1992).

1.1  The Elements of Programming

A powerful programming language is more than just a means for
instructing a computer to perform tasks. The language also serves as
a framework within which we organize our ideas about processes. Thus,
when we describe a language, we should pay particular attention to the
means that the language provides for combining simple ideas to form
more complex ideas. Every powerful language has three mechanisms for
accomplishing this:

  • primitive expressions, which represent the simplest entities the
    language is concerned with,
  • means of combination, by which compound elements are built from
    simpler ones, and
  • means of abstraction, by which compound elements can be named and
    manipulated as units.

In programming, we deal with two kinds of elements:
procedures and
data. (Later we will discover that they are really not so distinct.)
Informally, data is “stuff” that we want to manipulate, and
procedures are descriptions of the rules for manipulating the data.
Thus, any powerful programming language should be able to describe
primitive data and primitive procedures and should have methods for
combining and abstracting procedures and data.
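
As a small foretaste (a sketch only; the notation, including define,
is developed in the sections that follow, and the particular numbers
and the name size are chosen arbitrarily), the three mechanisms look
like this in Scheme:

486                          ; a primitive expression
(+ 137 349)                  ; a means of combination, building a compound expression
(define size (+ 137 349))    ; a means of abstraction, naming the compound element
(* size 2)                   ; the named element manipulated as a unit

Evaluating the last expression yields 972, twice the value 486 that
was named size.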
