Structure and Interpretation of Computer Programs

by Harold Abelson and Gerald Jay Sussman with Julie Sussman
    These skills are by no means unique to computer programming. The
techniques we teach and draw upon are common to all of engineering
design. We control complexity by building abstractions that hide
details when appropriate. We control complexity by establishing
conventional interfaces that enable us to construct systems by
combining standard, well-understood pieces in a “mix and match” way.
We control complexity by establishing new languages for describing a
design, each of which emphasizes particular aspects of the design and
deemphasizes others.
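    A small sketch of the first of these techniques, anticipating the
rational-number arithmetic developed in Chapter 2 (simplified here to
omit reduction to lowest terms): a constructor and selectors form an
abstraction barrier, and add-rat is written entirely in terms of that
interface, never in terms of the underlying pairs.

(define (make-rat n d) (cons n d))  ; representation hidden behind constructor
(define (numer x) (car x))          ; and selectors
(define (denom x) (cdr x))

(define (add-rat x y)               ; uses only the abstract interface
  (make-rat (+ (* (numer x) (denom y))
               (* (numer y) (denom x)))
            (* (denom x) (denom y))))
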
    Underlying our approach to this subject is our conviction that
“computer science” is not a science and that its significance has
little to do with computers. The computer revolution is a revolution
in the way we think and in the way we express what we think. The
essence of this change is the emergence of what might best be called
procedural epistemology – the study of the structure of
knowledge from an imperative point of view, as opposed to the more
declarative point of view taken by classical mathematical subjects.
Mathematics provides a framework for dealing precisely with notions of
“what is.” Computation provides a framework for dealing precisely
with notions of “how to.”
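    The square-root example developed early in Chapter 1 makes this
contrast concrete. Mathematics defines the square root of x
declaratively, as the nonnegative y whose square is x; that says what a
square root is but gives no method for finding one. The sketch below,
following the Newton's-method development in the book, says how to
compute one.

(define (average x y) (/ (+ x y) 2))
(define (square x) (* x x))

(define (improve guess x)           ; average the guess with x/guess
  (average guess (/ x guess)))

(define (good-enough? guess x)      ; stop when the square is close to x
  (< (abs (- (square guess) x)) 0.001))

(define (sqrt-iter guess x)
  (if (good-enough? guess x)
      guess
      (sqrt-iter (improve guess x) x)))

(define (sqrt x) (sqrt-iter 1.0 x))
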
    In teaching our material we use a dialect of the programming language
Lisp. We never formally teach the language, because we don't have to.
We just use it, and students pick it up in a few days. This is one
great advantage of Lisp-like languages: They have very few ways of
forming compound expressions, and almost no syntactic structure. All
of the formal properties can be covered in an hour, like the rules of
chess. After a short time we forget about syntactic details of the
language (because there are none) and get on with the real
issues – figuring out what we want to compute, how we will decompose
problems into manageable parts, and how we will work on the parts.
Another advantage of Lisp is that it supports (but does not enforce)
more of the large-scale strategies for modular decomposition of
programs than any other language we know. We can make procedural and
data abstractions, we can use higher-order functions to capture common
patterns of usage, we can model local state using assignment and data
mutation, we can link parts of a program with streams and delayed
evaluation, and we can easily implement embedded languages. All of
this is embedded in an interactive environment with excellent support
for incremental program design, construction, testing, and debugging.
We thank all the generations of Lisp wizards, starting with John
McCarthy, who have fashioned a fine tool of unprecedented power and
elegance.
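    One small illustration of the higher-order style mentioned above, a
sketch in the spirit of the sum procedure of Chapter 1: the common
pattern of summing a series is captured once, as a procedure that takes
procedures as arguments, and particular sums become one-line instances
of it.

(define (sum term a next b)        ; abstracts the pattern of summation
  (if (> a b)
      0
      (+ (term a)
         (sum term (next a) next b))))

(define (sum-squares a b)          ; one instance of the general pattern
  (sum (lambda (x) (* x x))
       a
       (lambda (x) (+ x 1))
       b))

(sum-squares 1 10)                 ; => 385
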
    Scheme, the dialect of Lisp that we use, is an attempt to bring
together the power and elegance of Lisp and Algol. From Lisp we take
the metalinguistic power that derives from the simple syntax, the
uniform representation of programs as data objects, and the
garbage-collected heap-allocated data. From Algol we take lexical
scoping and block structure, which are gifts from the pioneers of
programming-language design who were on the Algol committee. We wish
to cite John Reynolds and Peter Landin for their insights into the
relationship of Church's lambda calculus to the structure of
programming languages. We also recognize our debt to the
mathematicians who scouted out this territory decades before computers
appeared on the scene. These pioneers include Alonzo Church, Barkley
Rosser, Stephen Kleene, and Haskell Curry.
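    A minimal sketch of the lexical scoping and block structure referred
to above: a procedure defined inside another may refer freely to the
enclosing procedure's parameters, so the procedure returned below
remembers the n with which it was made.

(define (make-adder n)
  (define (add x) (+ x n))  ; add is block-structured and closes over n
  add)

((make-adder 3) 4)          ; => 7
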

 
    Acknowledgments
    We would like to thank the many people who have helped us develop this
book and this curriculum.
    Our subject is a clear intellectual descendant of “6.231,” a
wonderful subject on programming linguistics and the lambda calculus
taught at MIT in the late 1960s by Jack Wozencraft and Arthur Evans,
Jr.
    We owe a great debt to Robert Fano, who reorganized MIT's introductory
curriculum in electrical engineering and computer science to emphasize
the principles of engineering design.
