
Reading List


Structure and Interpretation of Computer Programs

May 10, 2017

This book is definitely one of my biggest influences as a coder. It's amazing on so many levels, but what stands out the most for me is its emphasis on design thinking.

After reading this classic, the way I see it now is that a programmer's growth follows two broad stages. First they learn a language or two; if they're working on challenging projects, they pick up the subtleties of those languages' grammars, along with the techniques and concepts needed to solve many specific problems. That's the first stage. The second stage comes once they're comfortable reading and writing a given language (once they're literate): they move past specific languages and start learning larger design thinking, paradigms for organizing code, patterns that transcend any one language.

With that said, I recommend this as a must-read for anyone who wants to know more about the linguistics of programming languages in general, or who wants to broaden their range of general design thinking.

Consisting of five rich chapters, the book's wealth of knowledge can be organized by its main theme: the philosophy of ideas, where an idea is constructed from

  1. primitive expressions
  2. means of combination
  3. means of abstraction

In particular Chapter 1 focuses on primitive expressions, Chapter 2 on means of combination, while the remaining Chapters 3, 4, 5 can be said to discuss means of abstraction.

Chapter 1 - Building Abstractions with Procedures

The language used throughout this book is Scheme, a member of the Lisp family, and its paradigm is therefore functional. In the functional paradigm everything can be thought of as a function, so naturally this introductory chapter focuses on the basics of computational functions, which the book calls procedures.

This chapter explores the range of what a procedure can be, ending with the idea of lambdas as anonymous procedures. It also sets up the transition to the next chapter by showing how to combine procedures, treating them as first-class citizens that can be passed as arguments and returned as values.
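A minimal sketch of the first-class idea, in Python rather than the book's Scheme: a procedure that takes procedures as arguments and returns a brand-new anonymous procedure.

```python
# compose takes two procedures as arguments and returns a new one:
# procedures flowing in through arguments and out as return values.
def compose(f, g):
    return lambda x: f(g(x))   # an anonymous procedure (a lambda)

square = lambda x: x * x
inc = lambda x: x + 1

square_then_inc = compose(inc, square)
print(square_then_inc(3))      # squares 3 to 9, then increments to 10
```

The returned lambda never needs a name of its own; giving it one afterwards is optional, which is exactly the point of anonymity.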

Chapter 2 - Building Abstractions with Data

With the basic means of combination now available to us, this chapter focuses on combining procedures into larger and broader patterns. In particular, this is where we learn to mitigate the complexity of design through modularity and additivity.

What I personally found most fascinating were the ideas of abstraction barriers, and how they can be used along with constructors and selectors to effectively define a type. Or maybe type discovery is a better phrase? The authors don't rely on type theory; instead they implement types through what they call "tagged data".
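The pattern can be sketched in a few lines of Python (the names here are my own; the book develops the same idea in Scheme with a rational-number type):

```python
# Constructor and selectors for a rational-number type, carrying a
# tag so the data announces its own type ("tagged data").
def make_rat(n, d):
    return ("rat", n, d)        # the tag travels with the value

def numer(r):
    assert r[0] == "rat"
    return r[1]

def denom(r):
    assert r[0] == "rat"
    return r[2]

def add_rat(a, b):
    # Written purely in terms of the selectors, so the tuple layout
    # stays hidden behind the abstraction barrier.
    return make_rat(numer(a) * denom(b) + numer(b) * denom(a),
                    denom(a) * denom(b))
```

Because `add_rat` only touches the constructor and selectors, the representation of a rational could change (say, to a class) without any code above the barrier noticing.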

The highlight of this chapter is the modularizing dispatch paradigm. When extended to support additivity, it diverges in two directions: data-directed programming and message passing. I admit my own intellectual boundaries were being pushed in realizing that data-directed programming is more suitable in contexts where operators are stable and types volatile, whereas message passing is the opposite, being more suitable when types are stable but operators volatile.
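Both directions can be caricatured in Python (my own sketch, not the book's code):

```python
import math

# Message passing: the datum itself is a procedure that dispatches on
# an operation name. Adding a new type touches nothing else, which
# suits the case of volatile types and stable operators.
def make_circle(r):
    def dispatch(op):
        if op == "area":
            return math.pi * r * r
        if op == "perimeter":
            return 2 * math.pi * r
        raise ValueError(f"unknown operation: {op}")
    return dispatch

# Data-directed programming: a table keyed by (operation, type).
# Adding a new operation is one new row per existing type, which
# suits stable types and volatile operators.
table = {
    ("area", "square"): lambda side: side * side,
    ("area", "circle"): lambda r: math.pi * r * r,
}

def apply_generic(op, tag, datum):
    return table[(op, tag)](datum)
```

In the first style the dispatch logic lives inside each object; in the second it lives in a shared table, and that placement is exactly what decides which kind of change is cheap.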

Chapter 3 - Modularity, Objects, and State

Ideas of state as well as assignment are introduced here, leading to some very interesting philosophical considerations regarding the nature of identity. Beyond this, the stream paradigm is introduced through the concepts of delay and force.
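The delay/force mechanism is small enough to sketch with Python closures standing in for the book's Scheme special forms (again, my own miniature, not the book's code):

```python
# delay wraps a computation in a memoized thunk; force calls it.
def delay(thunk):
    cache = []
    def promise():
        if not cache:
            cache.append(thunk())   # compute at most once
        return cache[0]
    return promise

def force(promise):
    return promise()

# An infinite stream as a pair of (head, delayed tail).
def integers_from(n):
    return (n, delay(lambda: integers_from(n + 1)))

def stream_ref(s, k):
    while k > 0:
        s = force(s[1])
        k -= 1
    return s[0]
```

Nothing infinite is ever built; each tail is computed only when forced, which is the whole trick behind the stream paradigm.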

In a way this chapter is independent of the main theme, but it's also a pivot point. By implementing examples of particular variant paradigms within this language, it offers fodder for the remaining two chapters.

Chapter 4 - Metalinguistic Abstraction

By playing with and loosening the model of evaluation, the previous chapter lets us transition to the idea of using one variant of the language to simulate other variants of the same language. It's one thing to implement variant language features ad hoc, but how do we scale? This book is all about design thinking, after all. Simulating the language's own evaluator is the means to scale: it gives us a systematic way to prototype other variants.
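To make the idea concrete, here is a toy evaluator in the spirit of the chapter, though far smaller than the book's metacircular evaluator: the host language interprets expressions of a tiny object language, represented as nested tuples.

```python
# A toy evaluator: expressions are numbers (self-evaluating) or
# nested tuples of the form (operator, operand, operand).
def evaluate(expr):
    if isinstance(expr, (int, float)):
        return expr                      # self-evaluating expression
    op, a, b = expr
    if op == "+":
        return evaluate(a) + evaluate(b)
    if op == "*":
        return evaluate(a) * evaluate(b)
    raise ValueError(f"unknown operator: {op}")

result = evaluate(("+", 1, ("*", 2, 3)))   # 1 + (2 * 3) = 7
```

Changing a rule of the object language means editing one branch of `evaluate`, which is precisely the systematic prototyping the chapter is after.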

The discussion of lazy evaluation I found most valuable, as this paradigm shows up in languages like Python and R, and it helps me understand them at a deeper level.
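One place laziness surfaces in everyday Python is the generator: each value is computed only when demanded, so even an "infinite" sequence is harmless.

```python
import itertools

# A generator is lazy: no value exists until a consumer asks for it.
def naturals():
    n = 0
    while True:
        yield n
        n += 1

# islice demands exactly five values and then stops.
first_five = list(itertools.islice(naturals(), 5))
```

The infinite loop in `naturals` never runs away, because evaluation is driven entirely by demand.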

Chapter 5 - Computing with Register Machines

One of the nicest aspects of this book is that it ends as strongly as it begins. The first four chapters were overwhelmingly beautiful, and this one does not let down. The transition to register machines to reimplement the metacircular evaluator, while showing exactly how tail recursion works (and that it is an implementation-specific feature), blew my mind.

For me, the other valuable part of this chapter was showing how the interpreter works. From the perspective offered here, compilers and interpreters differ only in their final step: whether to execute the expression directly or to translate it into machine code. For anyone looking to build a compiler, this offers a very gentle introduction.