Structure and Interpretation of Computer Programs

Harold Abelson and Gerald Jay Sussman with Julie Sussman

As one would expect from its goals, artificial intelligence research
generates many significant programming problems. In other
programming cultures this spate of problems spawns new languages.
Indeed, in any very large programming task a useful organizing
principle is to control and isolate traffic within the task modules
via the invention of language. These languages tend to become less
primitive as one approaches the boundaries of the system where we
humans interact most often. As a result, such systems contain complex
language-processing functions replicated many times. Lisp has such a
simple syntax and semantics that parsing can be treated as an
elementary task. Thus parsing technology plays almost no role in Lisp
programs, and the construction of language processors is rarely an
impediment to the rate of growth and change of large Lisp systems.
Finally, it is this very simplicity of syntax and semantics that is
responsible for the burden and freedom borne by all Lisp programmers.
No Lisp program of any size beyond a few lines can be written without
being saturated with discretionary functions. Invent and fit; have
fits and reinvent! We toast the Lisp programmer who pens his thoughts
within nests of parentheses.

Alan J. Perlis
New Haven, Connecticut
 

Preface to the Second Edition

Is it possible that software is not like anything else, that it
is meant to be discarded: that the whole point is to
always see it as a soap bubble?

Alan J. Perlis

The material in this book has been the basis of MIT's entry-level
computer science subject since 1980. We had been teaching this
material for four years when the first edition was published, and
twelve more years have elapsed until the appearance of this second
edition. We are pleased that our work has been widely adopted and
incorporated into other texts. We have seen our students take the
ideas and programs in this book and build them in as the core of new
computer systems and languages. In literal realization of an ancient
Talmudic pun, our students have become our builders. We are lucky to
have such capable students and such accomplished builders.

In preparing this edition, we have incorporated hundreds of
clarifications suggested by our own teaching experience and the
comments of colleagues at MIT and elsewhere. We have redesigned
most of the major programming systems in the book, including
the generic-arithmetic system, the interpreters, the register-machine
simulator, and the compiler; and we have rewritten all the program
examples to ensure that any Scheme implementation conforming to
the IEEE Scheme standard (IEEE 1990) will be able to run the code.

This edition emphasizes several new themes. The most important
of these is the central role played by different approaches to
dealing with time in computational models: objects with state,
concurrent programming, functional programming, lazy evaluation,
and nondeterministic programming. We have included new sections on
concurrency and nondeterminism, and we have tried to integrate this
theme throughout the book.

The first edition of the book closely followed the syllabus of our MIT
one-semester subject. With all the new material in the second
edition, it will not be possible to cover everything in a single
semester, so the instructor will have to pick and choose. In our own
teaching, we sometimes skip the section on logic programming
(section 4.4), we have students use the
register-machine simulator but we do not cover its implementation
(section 5.2), and we give only a cursory overview of
the compiler (section 5.5). Even so, this is still
an intense course. Some instructors may wish to cover only the first
three or four chapters, leaving the other material for subsequent
courses.

The World-Wide-Web site
www-mitpress.mit.edu/sicp
provides support for users of this book.
This includes programs from the book,
sample programming assignments, supplementary materials,
and downloadable implementations of the Scheme dialect of Lisp.

 

Preface to the First Edition

A computer is like a violin. You can imagine a novice trying first a
phonograph and then a violin. The latter, he says, sounds terrible.
That is the argument we have heard from our humanists and most of our
computer scientists. Computer programs are good, they say, for
particular purposes, but they aren't flexible. Neither is a violin,
or a typewriter, until you learn how to use it.

Marvin Minsky, “Why Programming Is a Good
Medium for Expressing Poorly-Understood and Sloppily-Formulated
Ideas”

“The Structure and Interpretation of Computer Programs” is the
entry-level subject in computer science at the Massachusetts Institute
of Technology. It is required of all students at MIT who major
in electrical engineering or in computer science, as one-fourth of the
“common core curriculum,” which also includes two subjects on
circuits and linear systems and a subject on the design of digital
systems. We have been involved in the development of this subject
since 1978, and we have taught this material in its present form since
the fall of 1980 to between 600 and 700 students each year. Most of
these students have had little or no prior formal training in
computation, although many have played with computers a bit and a few
have had extensive programming or hardware-design experience.

Our design of this introductory computer-science subject reflects two
major concerns. First, we want to establish the idea that a computer
language is not just a way of getting a computer to perform operations
but rather that it is a novel formal medium for expressing ideas about
methodology. Thus, programs must be written for people to read, and
only incidentally for machines to execute. Second, we believe that
the essential material to be addressed by a subject at this level is
not the syntax of particular programming-language constructs, nor
clever algorithms for computing particular functions efficiently, nor
even the mathematical analysis of algorithms and the foundations of
computing, but rather the techniques used to control the intellectual
complexity of large software systems.

Our goal is that students who complete this subject should have a good
feel for the elements of style and the aesthetics of programming.
They should have command of the major techniques for controlling
complexity in a large system. They should be capable of reading a
50-page-long program, if it is written in an exemplary style. They
should know what not to read, and what they need not understand at any
moment. They should feel secure about modifying a program, retaining
the spirit and style of the original author.

These skills are by no means unique to computer programming. The
techniques we teach and draw upon are common to all of engineering
design. We control complexity by building abstractions that hide
details when appropriate. We control complexity by establishing
conventional interfaces that enable us to construct systems by
combining standard, well-understood pieces in a “mix and match” way.
We control complexity by establishing new languages for describing a
design, each of which emphasizes particular aspects of the design and
deemphasizes others.
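
As a small taste of the first two of these strategies, here is a minimal
sketch, in the Scheme dialect used throughout the book, of a data
abstraction that hides its representation behind a constructor and
selectors; the names make-point, point-x, point-y, and distance are
invented for this illustration and are not drawn from the text.

    (define (square x) (* x x))

    ;; Constructor and selectors form the conventional interface to a point.
    ;; The representation (a pair) is hidden; it could change without
    ;; disturbing any procedure written against the interface.
    (define (make-point x y) (cons x y))
    (define (point-x p) (car p))
    (define (point-y p) (cdr p))

    ;; distance uses only the interface, never car or cdr directly.
    (define (distance p q)
      (sqrt (+ (square (- (point-x p) (point-x q)))
               (square (- (point-y p) (point-y q))))))

    (distance (make-point 0 0) (make-point 3 4))   ; evaluates to 5 (or 5.0)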

Underlying our approach to this subject is our conviction that
“computer science” is not a science and that its significance has
little to do with computers. The computer revolution is a revolution
in the way we think and in the way we express what we think. The
essence of this change is the emergence of what might best be called
procedural epistemology
– the study of the structure of
knowledge from an imperative point of view, as opposed to the more
declarative point of view taken by classical mathematical subjects.
Mathematics provides a framework for dealing precisely with notions of
“what is.” Computation provides a framework for dealing precisely
with notions of “how to.”
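
The square-root procedure developed in chapter 1 makes the contrast
concrete: mathematics characterizes the square root of x declaratively, as
the nonnegative y satisfying y² = x, while a program must describe how to
find such a y, for example by repeatedly improving a guess. A minimal
sketch along those lines (the tolerance 0.001 and the names used here are
chosen only for the illustration):

    ;; "How to" find a square root: improve a guess until it is good enough.
    (define (average a b) (/ (+ a b) 2))

    (define (improve guess x)              ; Newton's step for y^2 = x
      (average guess (/ x guess)))

    (define (good-enough? guess x)
      (< (abs (- (* guess guess) x)) 0.001))

    (define (sqrt-iter guess x)
      (if (good-enough? guess x)
          guess
          (sqrt-iter (improve guess x) x)))

    (define (my-sqrt x) (sqrt-iter 1.0 x))  ; (my-sqrt 2) => about 1.4142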

In teaching our material we use a dialect of the programming language
Lisp. We never formally teach the language, because we don't have to.
We just use it, and students pick it up in a few days. This is one
great advantage of Lisp-like languages: They have very few ways of
forming compound expressions, and almost no syntactic structure. All
of the formal properties can be covered in an hour, like the rules of
chess. After a short time we forget about syntactic details of the
language (because there are none) and get on with the real
issues – figuring out what we want to compute, how we will decompose
problems into manageable parts, and how we will work on the parts.
Another advantage of Lisp is that it supports (but does not enforce)
more of the large-scale strategies for modular decomposition of
programs than any other language we know. We can make procedural and
data abstractions, we can use higher-order functions to capture common
patterns of usage, we can model local state using assignment and data
mutation, we can link parts of a program with streams and delayed
evaluation, and we can easily implement embedded languages. All of
this is embedded in an interactive environment with excellent support
for incremental program design, construction, testing, and debugging.
We thank all the generations of Lisp wizards, starting with John
McCarthy, who have fashioned a fine tool of unprecedented power and
elegance.
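
For instance, a single higher-order procedure can capture a whole family of
computations, and delay and force give a first taste of delayed evaluation.
The sketch below is a simplified variant of examples from chapters 1 and 3,
not code quoted from the book:

    ;; A higher-order procedure capturing the pattern "sum f(k) for k = a..b".
    (define (sum f a b)
      (if (> a b)
          0
          (+ (f a) (sum f (+ a 1) b))))

    (define (sum-integers a b) (sum (lambda (k) k) a b))        ; (sum-integers 1 10) => 55
    (define (sum-squares  a b) (sum (lambda (k) (* k k)) a b))  ; (sum-squares 1 3)   => 14

    ;; Delayed evaluation: the expression inside delay is not computed
    ;; until the resulting promise is forced.
    (define p (delay (* 6 7)))
    (force p)                                                   ; => 42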

Scheme, the dialect of Lisp that we use, is an attempt to bring
together the power and elegance of Lisp and Algol. From Lisp we take
the metalinguistic power that derives from the simple syntax, the
uniform representation of programs as data objects, and the
garbage-collected heap-allocated data. From Algol we take lexical
scoping and block structure, which are gifts from the pioneers of
programming-language design who were on the Algol committee. We wish
to cite John Reynolds and Peter Landin for their insights into the
relationship of Church's lambda calculus to the structure of
programming languages. We also recognize our debt to the
mathematicians who scouted out this territory decades before computers
appeared on the scene. These pioneers include Alonzo Church, Barkley
Rosser, Stephen Kleene, and Haskell Curry.
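
The block-structured square-root procedure developed in chapter 1 shows
what these gifts buy: the same computation sketched earlier, restructured
so that its helper procedures are internal definitions which, under lexical
scoping, refer directly to the parameter x. The version below is a lightly
adapted sketch, renamed my-sqrt so as not to shadow the primitive sqrt:

    ;; Block structure: good-enough?, improve, and sqrt-iter are internal to
    ;; my-sqrt, and lexical scoping lets them refer directly to x.
    (define (my-sqrt x)
      (define (good-enough? guess)
        (< (abs (- (* guess guess) x)) 0.001))
      (define (improve guess)
        (/ (+ guess (/ x guess)) 2))
      (define (sqrt-iter guess)
        (if (good-enough? guess)
            guess
            (sqrt-iter (improve guess))))
      (sqrt-iter 1.0))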

 

Acknowledgments

We would like to thank the many people who have helped us develop this
book and this curriculum.

Our subject is a clear intellectual descendant of “6.231,” a
wonderful subject on programming linguistics and the lambda calculus
taught at MIT in the late 1960s by Jack Wozencraft and Arthur Evans,
Jr.

We owe a great debt to Robert Fano, who reorganized MIT's introductory
curriculum in electrical engineering and computer science to emphasize
the principles of engineering design. He led us in starting out on
this enterprise and wrote the first set of subject notes from which
this book evolved.

Much of the style and aesthetics of programming that we try to teach
were developed in conjunction with Guy Lewis Steele Jr., who
collaborated with Gerald Jay Sussman in the initial development of the
Scheme language. In addition, David Turner, Peter Henderson, Dan
Friedman, David Wise, and Will Clinger have taught us many of the
techniques of the functional programming community that appear in this
book.

Joel Moses taught us about structuring large systems. His experience
with the Macsyma system for symbolic computation provided the insight
that one should avoid complexities of control and concentrate on
organizing the data to reflect the real structure of the world being
modeled.

Marvin Minsky and Seymour Papert formed many of our attitudes about
programming and its place in our intellectual lives. To them we owe
the understanding that computation provides a means of expression for
exploring ideas that would otherwise be too complex to deal with
precisely. They emphasize that a student's ability to write and
modify programs provides a powerful medium in which exploring becomes
a natural activity.

We also strongly agree with Alan Perlis that programming is lots of
fun and we had better be careful to support the joy of programming.
Part of this joy derives from observing great masters at work. We are
fortunate to have been apprentice programmers at the feet of Bill
Gosper and Richard Greenblatt.

It is difficult to identify all the people who have contributed to the
development of our curriculum. We thank all the lecturers, recitation
instructors, and tutors who have worked with us over the past fifteen
years and put in many extra hours on our subject, especially Bill
Siebert, Albert Meyer, Joe Stoy, Randy Davis, Louis Braida, Eric
Grimson, Rod Brooks, Lynn Stein, and Peter Szolovits.
We would like to specially acknowledge the outstanding teaching
contributions of Franklyn Turbak, now at Wellesley; his work
in undergraduate instruction set a standard that we can
all aspire to.
We are grateful to Jerry Saltzer and Jim Miller for
helping us grapple with the mysteries of concurrency, and to
Peter Szolovits and David McAllester for their contributions
to the exposition of nondeterministic evaluation in chapter 4.

Many people have put in significant effort presenting this material at
other universities. Some of the people we have worked closely with
are Jacob Katzenelson at the Technion, Hardy Mayer at the University
of California at Irvine, Joe Stoy at Oxford, Elisha Sacks at Purdue,
and Jan Komorowski at the Norwegian University of Science and
Technology. We are exceptionally proud of our colleagues who have
received major teaching awards for their adaptations of this subject
at other universities, including Kenneth Yip at Yale, Brian Harvey at
the University of California at Berkeley, and Dan Huttenlocher at
Cornell.
