This list is a realization that I have read many books and hold many strong opinions. I have decided to write everything down to organize my thoughts. It’s not in alphabetical order because I’m lazy. This list is constantly updating.

What I’ve read

Recursive Function Theory & Logic (1971)

Ann Yasuhara

I wanted to find this book for exactly one reason. Post’s correspondence problem (PCP) is a famous example of an undecidable problem which has nothing to do with Turing machines. The original proof by Post is vastly different from the proof given by Sipser, and I wanted to see how this developed. There is a manuscript by Floyd called New Proofs of Old Theorems which was inaccessible to me. This book cites that paper, and attributes its proof of the PCP theorem to it. It is very similar in spirit to the proof given by Sipser, but it requires the development of semi-Thue systems, and the unsolvability of their word problem.
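Part of PCP’s charm is how concrete it is: given a list of dominoes, each with a top string and a bottom string, decide whether some nonempty sequence of dominoes (repeats allowed) spells the same string on top and bottom. A minimal sketch of a small instance and a match checker (the instance here is just an illustrative example, not from the book):

```python
# A PCP instance is a list of dominoes (top, bottom).
# A match is a nonempty index sequence where the concatenated
# tops equal the concatenated bottoms.
dominoes = [("b", "ca"), ("a", "ab"), ("ca", "a"), ("abc", "c")]

def is_match(instance, indices):
    """Check whether an index sequence is a valid PCP match."""
    top = "".join(instance[i][0] for i in indices)
    bottom = "".join(instance[i][1] for i in indices)
    return len(indices) > 0 and top == bottom

# The sequence 2,1,3,2,4 (1-based) is a match:
# both rows spell "abcaaabc".
print(is_match(dominoes, [1, 0, 2, 1, 3]))  # True
```

Checking a proposed match is trivially computable; what is undecidable is whether any match exists at all.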

Collected Fictions

Jorge Luis Borges

Like Black Mirror for computer scientists or something.

Complexity and Real Computation (1998)

Lenore Blum, Felipe Cucker, Michael Shub, Steve Smale

This book is a generalization of classical computation. Here, instead of symbols from a finite alphabet, registers can hold arbitrary real numbers. What we get is a computational model for uncountable sets, and an entry point to discussing their decidability and complexity with respect to this model. There is no analogous version of the Church-Turing thesis, but this is expected. It is unfortunate that some of the results cannot be self-contained. For example, they show that the Mandelbrot set is undecidable in this model, but the proof requires an advanced digression. Discrete mathematicians do not have many tools: combinatorics, logic, some trivial parts of abstract algebra. I can see the motivation. Continuous mathematics has centuries of techniques and tools. It would be hopeful if somehow, using these tools to answer the analogous questions over the reals might say something about the classical discrete questions, but I am not confident that will be the case.
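For a feel of why the Mandelbrot set is a natural object for real computation: a point c lies outside the set exactly when iterating z -> z*z + c from 0 eventually escapes radius 2, so a finite computation can refute membership but never confirm it. A minimal sketch (function and parameter names are mine):

```python
def mandelbrot_escape_time(c, max_iter=1000):
    """Iterate z -> z*z + c starting from z = 0.

    Returns the first iteration at which |z| > 2 (so c is
    certainly NOT in the Mandelbrot set), or None if no escape
    was seen within max_iter steps (membership unresolved).
    """
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n + 1
    return None

print(mandelbrot_escape_time(0j))      # None: the orbit stays at 0
print(mandelbrot_escape_time(1 + 0j))  # 3: the orbit 1, 2, 5 escapes
```

This is only a semi-decision procedure in one direction, which is exactly the asymmetry the BSS framework lets you interrogate precisely.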

Introduction to the Theory of Computation (1997, 2013)

Michael Sipser

Where do I even start with this book. I feel like I could write an entire book just about the way this book presents material. This is an undergraduate-level book for automata and complexity theory, and the recommended text for probably every course in the country. When I first took a course under this book, I wasn’t fond of it. I was a math undergrad, and I felt like it was holding my hand too much. Only after I became a TA did I really come to respect its ingenuity. Over the past century, computer science has gone from a subfield of mathematics to an engineering discipline. People earning computer science degrees are going out there and building the world; they don’t have time for frivolous activities. The book targets these people, arguing that theory is important. It’s not directed at people who already respect the power of theory. The knowledge of computability theory has been perfectly distilled in three ways. First, many theorems traditionally presented in recursive function theory are rewritten and reproved in Turing machine syntax. Students have a natural, intuitive understanding of the power of computer programs, so this makes results more obvious. There is no need to get expository about Gödel numberings because everyone knows what code is. Second is the omission of certain theorems. Rice’s theorem and the Myhill-Nerode theorem are important, but are notably absent from the book. As a TA, I found that these are very powerful tools, which are open to a large amount of misuse. On exams, if a student tried to use Myhill-Nerode instead of the pumping lemma, it was a sign they didn’t understand how to use pumping, and they would get the question totally wrong. Similarly for Rice’s theorem in place of reductions. Finally, the naming of theorems. Notably absent is the attribution of theorems. Kleene’s recursion theorem is just “the recursion theorem”. Kolmogorov complexity is just “descriptive complexity”. The Baker-Gill-Solovay result is just Theorem 9.20.
Naming theorems is a useful tool. If you say “Baker-Gill-Solovay” to someone in the same language domain, they know exactly what you are talking about. This makes the book not as good a reference, but more self-contained, like a movie. In fact, the book contains only a single inline citation. This doesn’t make the book bad, not even close. Every proof is the clearest and simplest version of the proof I have ever seen. I have probably read the book cover to cover 20 times. I own two hardcover copies; one was a gift.

Elements of the Theory of Computation (1981)

Harry R. Lewis, Christos H. Papadimitriou

You know how a director’s cut of a movie sometimes isn’t better or worse, just different? That’s the best analogy I could make with this book. The Sipser book feels almost like a subset of this book, especially the early parts. The notation is less clear, but there is more thoroughness. Two specific details I would like to share are a problem about regular languages and the proof of the Church-Turing Thesis. I had looked for a characterization of the unary regular languages, and this book provides it as a problem: a unary language is regular if and only if it is a finite union of unary languages whose lengths form arithmetic progressions. For the proof of the CTT, their argument is as follows. Let $\mathcal{L}(M)$ be the class of languages decidable by the computational model $M$. Through simulation, they show $\mathcal{L}(M_1) = \mathcal{L}(M_2) = \mathcal{L}(M_3)$, where $M_1$ is Turing machines, $M_2$ is unrestricted grammars, and $M_3$ is the recursive functions, i.e. the primitive recursive functions plus unbounded minimization. It depends who you are, but such a proof can be more convincing for the CTT if you don’t have the intuition. Would you believe me if I said I decided to try reading this book based on the stylish font of the cover?
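The characterization is the same as saying the set of accepted lengths is eventually periodic, which falls right out of a unary DFA: over a one-letter alphabet the transition function is just a map from states to states, so the walk from the start state is a tail followed by a cycle, and each accepting state on the cycle contributes one arithmetic progression. A minimal sketch (the DFA below is a made-up example, not from the book):

```python
def accepted_lengths(delta, start, accepting, limit):
    """Lengths n <= limit accepted by a unary DFA.

    delta: list where delta[q] is the unique successor of state q
    (with a one-letter alphabet, the transition function is just
    a map from states to states).
    """
    lengths, q = [], start
    for n in range(limit + 1):
        if q in accepting:
            lengths.append(n)
        q = delta[q]
    return lengths

# Made-up 4-state unary DFA: tail 0 -> 1, then cycle 1 -> 2 -> 3 -> 1.
delta = [1, 2, 3, 1]
print(accepted_lengths(delta, 0, {2}, 12))
# [2, 5, 8, 11]: the arithmetic progression 2 + 3k
```

The cycle length bounds the common difference of every progression, which is why only finitely many progressions are needed.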

Topics in Algebra

I.N. Herstein

A classic. When he was like “here’s three proofs of Sylow’s theorem”, I was like “oh yeah baby”.

Alfred Tarski, Life and Logic (2004)

Anita Burdman Feferman, Solomon Feferman

A biography of Alfred Tarski, with some mathematical digressions into his work. The largest takeaway for me was actually the sheer amount of drugs Tarski consumed on a daily basis. I have tried to extract his routine. He would normally wake up around 10AM, after staying up all night. He would start the day with speed, more commonly known as meth. Throughout the day he would consume large quantities of cigarettes, cigarillos, coffee, Benzedrine, and Kola Astier, a Parisian coffee/cocaine mixture. He would invite a grad student to work through the night with him at his home. They would start at 9PM and work tirelessly, stopping for coffee around 2AM. He would then sleep at around 4:30AM. The other personal details pale in comparison to this.

Formal Languages and Their Relation to Automata (1969)

John Hopcroft, Jeffrey Ullman

This is the classic book everyone talks about. I think there are differences between editions, but the one I have is the oldest. I have grown to appreciate it more than I originally did.

Theory of Recursive Functions and Effective Computability (1967)

Hartley Rogers Jr.

I could say a lot about this book, but its defining character is not the theorems and proofs, but rather the perspectives and human words. I found the explanations to really help with understanding material that is very challenging by nature. The style of proof also has an older accent, which I found interesting.

Logicomix, An Epic Search for Truth (2009)

Apostolos Doxiadis, Christos H. Papadimitriou

This is a graphic novel of the life of Bertrand Russell and his contemporaries. I’ve always thought of Gödel’s ending of Hilbert’s program as very striking, and deserving of dramatization. I would love to be able to assign this as homework or something.

Computability and Unsolvability (1958)

Martin Davis

This was the first book I ended up looking at when I wanted to learn more advanced topics in computability theory. I remember working through it being very difficult, especially since the notation can be abusive. It was written before Hilbert’s 10th problem was solved, so some of the material is motivated by this. There is a focus on connections between recursively enumerable sets and diophantine predicates. The following quote will forever stay with me. “…Thus from the point of view of recursive-function theory, there is no essential distinction between singulary and n-ary (n > 1) functions. It is as though, in analysis, one possessed analytic homeomorphisms between one-dimensional and n-dimensional euclidean space.”
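The quote is about pairing: a computable bijection between pairs of naturals and naturals lets you trade any n-ary function for a singulary one. A minimal sketch using the classic Cantor pairing function (the code is mine, not from the book):

```python
from math import isqrt

def cantor_pair(x, y):
    """Bijection from pairs of naturals to a single natural."""
    return (x + y) * (x + y + 1) // 2 + y

def cantor_unpair(z):
    """Inverse of cantor_pair."""
    # w is the largest value with w * (w + 1) // 2 <= z;
    # isqrt keeps the square root exact over the integers.
    w = (isqrt(8 * z + 1) - 1) // 2
    y = z - w * (w + 1) // 2
    return w - y, y

print(cantor_pair(3, 4))    # 32
print(cantor_unpair(32))    # (3, 4)
```

Iterating the pairing gives bijections for every arity, which is the “analytic homeomorphism” of the quote in miniature.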

A Brief History of Time (1988)

Stephen Hawking

I don’t really know much about physics anymore, besides quantum computing, but that’s not really the same. This book, written more for the layman, was still insightful, especially chapter 9.

Computational Complexity: A Modern Approach

Sanjeev Arora, Boaz Barak

First off, what is complexity theory? To me, it is about the deep mathematical relations between complexity classes. Complexity theory should be taught as a set of tools and techniques. The history of complexity theory is these tools failing against $\mathsf{P}=\mathsf{NP}$, but complexity theory has more tools and techniques than any other mathematical field I can think of. As a contrast, computability theory really is just diagonalization and reduction. Number theory is what? Induction and Hensel lifting? Continuing with that analogy, complexity theory is a toolbox of a thousand specialized hammers. I have the unpopular opinion that this book is just okay. It doesn’t do an amazing job revealing the deep mathematical relations you might seek. It doesn’t do an amazing job giving you an advanced set of tools either. For example, take the early chapter on diagonalization, and Ladner’s theorem. The fast version of Ladner’s theorem is that if $\mathsf{P} \neq \mathsf{NP}$ then there exist languages in $\mathsf{NP} \setminus \mathsf{P}$ which aren’t $\mathsf{NP}$-complete. There are two common proofs of Ladner’s theorem: one is by padding SAT, and the other is by “blowing holes” in SAT. The padding one gives you just enough to show the fast version. However, there is a stronger version of Ladner’s theorem, which says that for every computable language $A \notin \mathsf{P}$ there is a computable language $B \notin \mathsf{P}$ such that $B \leq_p A$ but $A \not\leq_p B$. The book mostly cares about showing that $\mathsf{NP}$-intermediate languages exist. Again, with the toolbox analogy, the harder proof of Ladner’s theorem is not just about the result, but the style of proof and technique it provides. This stronger version of Ladner’s theorem also reveals something much deeper and more interesting: that polytime reducibility forms a dense partial order on the computable languages not in $\mathsf{P}$. This is just one of many examples where richness becomes bland. Some good things I can say about the book are that it’s relatively new, from this millennium, and that it has incredible breadth.

Hilbert’s 10th Problem

Yuri Matiyasevich

This book is the self-contained proof of the negative answer to Hilbert’s 10th problem. I began reading it in hopes I could present a proof outline in one or two lectures. That is unfortunately not the case. This is a much simpler proof than the sequence of original papers, but it’s still very hard. Chapter two required an insane amount of attention and coffee. This was an experience for me to learn the hard way that sometimes the simplest version of something still isn’t simple enough. Some results just have to be long and complex.

Mathematics and Computation

Avi Wigderson

It’s as if he read someone else’s complexity book, and then provided an entire book of comments and advice. It’s openly proud that it contains no serious proofs; it’s not a textbook. It is incredibly entertaining to read, and hard to put down. The presentation is almost theatrical. I annoyingly recommend this to everyone.

Currently Reading

The Man Who Loved Only Numbers

Paul Hoffman

The Structure of Scientific Revolutions

Thomas S. Kuhn

Abel’s Proof

Peter Pesic

Handbook of Theoretical Computer Science, Volume A: Algorithms and Complexity

Several Authors

A New Kind of Science

Stephen Wolfram