- Meeting 1: Stable Matching, from Section 0.5 of Lecture 0 of Jeff Erickson's notes. This discussion will help clarify the "prerequisites" for this course. Also read Section 0.6 for a sense of what this course is really about.
- Meeting 2: Completing the stable matching discussion. Reduction and Recursion (1.1 and 1.2), Towers of Hanoi (1.3), Mergesort (1.4), and Quicksort (1.5).
- Reading for this week: the relevant parts of the lecture notes listed above.
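As a companion to the stable matching discussion, here is a minimal sketch of the proposal (Gale-Shapley) algorithm. The dict-of-ranked-lists representation and the names below are illustrative assumptions, not the notation of the notes.

```python
def gale_shapley(prop_prefs, recv_prefs):
    """Proposer-optimal stable matching.

    prop_prefs[p] is p's preference list of receivers, best first;
    recv_prefs[r] likewise. Returns a dict receiver -> matched proposer."""
    rank = {r: {p: i for i, p in enumerate(prefs)} for r, prefs in recv_prefs.items()}
    next_choice = {p: 0 for p in prop_prefs}   # index of p's next proposal
    match = {}                                 # receiver -> current proposer
    free = list(prop_prefs)
    while free:
        p = free.pop()
        r = prop_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if r not in match:
            match[r] = p                       # r accepts a first proposal
        elif rank[r][p] < rank[r][match[r]]:
            free.append(match[r])              # r trades up; the old partner is free again
            match[r] = p
        else:
            free.append(p)                     # r rejects; p will try the next choice
    return match
```

Each proposer proposes at most once to each receiver, so the loop makes at most n^2 proposals, and the resulting matching has no blocking pair.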

- Meeting 1: Mergesort (1.4) and Quicksort (1.5)
- Meeting 2: Selection (1.7)
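A sketch of selection by partitioning (quickselect), using a fixed middle-element pivot for reproducibility; the pivot rule in the notes' treatment of Section 1.7 may differ.

```python
def quickselect(a, k):
    """Return the k-th smallest element (0-indexed) of the list a."""
    pivot = a[len(a) // 2]
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    if k < len(less):
        return quickselect(less, k)            # answer lies among the smaller elements
    if k < len(less) + len(equal):
        return pivot                           # answer is the pivot itself
    greater = [x for x in a if x > pivot]
    return quickselect(greater, k - len(less) - len(equal))
```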

- Meeting 1: Multiplying two n-bit integers (1.8). Solving clean recurrences (Section 3 of appendix on solving recurrences).
- Meeting 2: Solving recurrences (Section 2 of appendix on solving recurrences). We saw how we could verify a guessed solution (obtained perhaps from a simplified clean recurrence) by induction. We touched on domain transformations (pages 17 and 18 of said appendix) that help reduce some nasty recurrences to clean ones.
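The divide-and-conquer multiplication of 1.8 yields the clean recurrence T(n) = 3T(n/2) + O(n), which solves to O(n^{log_2 3}). A sketch for nonnegative Python integers; the base-case cutoff of 16 is an arbitrary illustrative choice.

```python
def karatsuba(x, y):
    """Multiply nonnegative integers using three half-size recursive products."""
    if x < 16 or y < 16:
        return x * y                              # small enough: multiply directly
    n = max(x.bit_length(), y.bit_length()) // 2
    xh, xl = x >> n, x & ((1 << n) - 1)           # split x into high and low halves
    yh, yl = y >> n, y & ((1 << n) - 1)
    hi = karatsuba(xh, yh)
    lo = karatsuba(xl, yl)
    # (xh + xl)(yh + yl) - hi - lo = xh*yl + xl*yh, saving one recursive call
    mid = karatsuba(xh + xl, yh + yl) - hi - lo
    return (hi << (2 * n)) + (mid << n) + lo
```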

- Meeting 1: Domain transformations. Backtracking -- subset sum (Section 3.3 in Lecture 3).
- Meeting 2: Backtracking -- subset sum and longest increasing subsequences.
- Reading: Sections 3.1 and 3.2 -- we didn't cover these but they round out our understanding in a good way.
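A sketch of backtracking for subset sum, assuming positive integers and a nonnegative target; each call branches on whether the first element is in the subset.

```python
def subset_sum(xs, target):
    """Decide whether some subset of the positive integers xs sums to target."""
    if target == 0:
        return True                   # the empty subset works
    if not xs:
        return False                  # positive target, nothing left to use
    first, rest = xs[0], xs[1:]
    # Branch 1: include first (only if it does not overshoot the target).
    if first <= target and subset_sum(rest, target - first):
        return True
    # Branch 2: exclude first.
    return subset_sum(rest, target)
```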

- Meeting 1: Dynamic programming -- Longest Increasing Subsequence (5.2) and Computing Edit distance (5.5)
- Meeting 2: Computing edit distance.
- Reading: Sections 5.1, 5.3, and 5.4 can be useful additional reading.
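The edit distance computation can be sketched as the standard table-filling dynamic program; variable names here are illustrative.

```python
def edit_distance(a, b):
    """Levenshtein distance: dist[i][j] = edit distance between a[:i] and b[:j]."""
    m, n = len(a), len(b)
    dist = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dist[i][0] = i                                  # delete all of a[:i]
    for j in range(n + 1):
        dist[0][j] = j                                  # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0 if a[i - 1] == b[j - 1] else 1
            dist[i][j] = min(dist[i - 1][j] + 1,        # delete a[i-1]
                             dist[i][j - 1] + 1,        # insert b[j-1]
                             dist[i - 1][j - 1] + sub)  # match or substitute
    return dist[m][n]
```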

- Meeting 1: Dynamic Programming for Subset Sum, and max-weight independent set in a tree. The latter roughly corresponds to Section 5.7.
- Meeting 2: Randomized Algorithms -- review of probabilistic concepts.
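A sketch of the tree DP for max-weight independent set, assuming the rooted tree is given as a child-list dict (an illustrative representation): each node carries two values, the best weight with the node excluded and with it included.

```python
def max_weight_independent_set(tree, weight, root):
    """Maximum total weight of an independent set of nodes in a rooted tree.

    tree maps a node to the list of its children; weight maps a node to
    its nonnegative weight."""
    def solve(v):
        exclude, include = 0, weight[v]
        for c in tree.get(v, []):
            ex_c, in_c = solve(c)
            exclude += max(ex_c, in_c)   # an excluded node's children are free
            include += ex_c              # an included node's children must be excluded
        return exclude, include
    return max(solve(root))
```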

- Meeting 1: The contention resolution protocol. Our discussion is based on Section 13.1 of the textbook "Algorithm Design" by Kleinberg and Tardos. Some notes are posted within the content link of the course's ICON page.
- Meeting 2: Randomized Minimum Cut, from Lecture 14 of Jeff Erickson's notes.
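A sketch of one run of the randomized contraction algorithm, assuming the input is a connected simple graph given as an edge list; the union-find bookkeeping is an implementation convenience, not part of the algorithm's description in the notes.

```python
import random

def karger_cut(edges, n):
    """One run of the contraction algorithm: contract random edges until two
    super-nodes remain, then return the number of edges crossing between them
    (a genuine cut, not necessarily the minimum one)."""
    parent = list(range(n))
    def find(v):                         # union-find with path halving
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    remaining = n
    while remaining > 2:
        u, v = random.choice(edges)      # uniform over surviving edges
        parent[find(u)] = find(v)        # contract the edge (u, v)
        remaining -= 1
        edges = [e for e in edges if find(e[0]) != find(e[1])]  # drop self-loops
    return len(edges)
```

Each run finds a fixed minimum cut with probability at least 2/(n(n-1)), so the minimum over many independent runs is the true minimum cut with high probability.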

- Meeting 1: Completing the randomized min-cut analysis, and reviewing for the midterm.
- Meeting 2: Midterm

- Meeting 1: Random variables and expectation. Examples: coin tosses and guessing games. We stressed the use of indicator random variables and linearity of expectation. Some notes are posted within the content tab in ICON.
- Meeting 2: Expected running time of quicksort -- analysis via a recurrence, and via indicator random variables. In the notes, this corresponds to the analysis of the nuts-and-bolts matching problem in Lecture 9.
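The indicator-variable technique can be checked exhaustively on a small example: the expected number of fixed points of a random permutation (one version of a guessing game). With indicators X_i = [pi(i) = i], linearity gives E[sum X_i] = n * (1/n) = 1, regardless of n. The example itself is an illustration, not taken from the notes.

```python
from itertools import permutations

def average_fixed_points(n):
    """Exact average number of fixed points over all permutations of n items.

    Linearity of expectation predicts this average is exactly 1 for every n."""
    perms = list(permutations(range(n)))
    total = sum(sum(1 for i, x in enumerate(p) if x == i) for p in perms)
    return total / len(perms)
```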

- Meeting 1: Coupon Collection, and Analysis of Selection. Neither of these is from the lecture notes, so some notes are posted within the content tab in ICON.
- Meeting 2: Hash Tables from Lecture 12 -- universal families of hash functions and the expected run time bounds they yield for hashing operations.
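The coupon-collector expectation can be obtained by solving a simple recurrence on the number of distinct coupons seen; the recurrence formulation below is one standard route (an illustration), and it recovers the familiar n * H_n answer.

```python
def coupon_collector_expectation(n):
    """Expected number of draws to see all n coupon types at least once.

    Let E_k be the expected remaining draws after k distinct types have been
    seen. Conditioning on one draw gives
        E_k = 1 + (k/n) * E_k + ((n-k)/n) * E_{k+1},
    which solves to E_k = n/(n-k) + E_{k+1}, with E_n = 0."""
    e = 0.0
    for k in range(n - 1, -1, -1):
        e += n / (n - k)          # add E_k's contribution, working backwards
    return e
```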

- Meeting 1: Showing universality for a certain hash function family. Concluding remarks on hashing.
- Meeting 2: Network Flow and Minimum Cuts, from Lecture 23. Slides posted within content tab in ICON.
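The universality property from Meeting 1 can be checked exhaustively for the small family h_{a,b}(x) = ((ax + b) mod p) mod m with a in {1,...,p-1} and b in {0,...,p-1}. This is a standard universal family; whether it is exactly the one from lecture is an assumption.

```python
def collision_probability(x, y, p, m):
    """Fraction of functions h_{a,b}(z) = ((a*z + b) % p) % m that
    collide on the distinct keys x and y."""
    hits = sum(((a * x + b) % p) % m == ((a * y + b) % p) % m
               for a in range(1, p) for b in range(p))
    return hits / ((p - 1) * p)

# Exhaustively verify the universality bound Pr[collision] <= 1/m
# for every pair of distinct keys in {0, ..., p-1}.
p, m = 13, 4
worst = max(collision_probability(x, y, p, m)
            for x in range(p) for y in range(x + 1, p))
assert worst <= 1 / m
```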

- Meeting 1: Basic Network Flow Algorithm, its correctness, and the max-flow min-cut theorem. Maximum Matching in bipartite graphs (Sections 24.3 and 24.4).
- Meeting 2: Baseball elimination (Section 24.5). We covered some material that is not in the notes -- using the min-cut to prove Hall's theorem, and to construct a nice certificate that a team has been eliminated. This material is on the slides posted on ICON.
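Specialized to unit capacities, the augmenting-path idea gives maximum bipartite matching directly; a sketch, with an adjacency-list representation that is an illustrative assumption.

```python
def max_bipartite_matching(adj, n_right):
    """Maximum matching via augmenting paths -- the specialization of the
    basic flow algorithm to unit-capacity bipartite networks.

    adj[u] lists the right-side vertices adjacent to left vertex u."""
    match = [-1] * n_right            # match[v] = left partner of right vertex v

    def augment(u, seen):
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                # Take v if it is free, or if its partner can re-match elsewhere.
                if match[v] == -1 or augment(match[v], seen):
                    match[v] = u
                    return True
        return False

    return sum(augment(u, set()) for u in range(len(adj)))
```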

- Meeting 1: Project Selection: an application of minimum cut (Section 24.6). Reductions: Perfect matching to CNF satisfiability.
- Meeting 2: More reductions involving independent set, CNF-SAT, and 3-colorability. Towards a definition of the class NP.
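One way to make the matching-to-CNF reduction concrete: one Boolean variable per edge, with clauses forcing every left vertex to be matched and every vertex to be matched at most once. Both the encoding and the brute-force satisfiability check below are illustrative sketches, not the exact construction from class.

```python
from itertools import combinations, product

def matching_to_cnf(left, right, edges):
    """Encode 'this bipartite graph has a perfect matching' as CNF clauses
    over one Boolean variable per edge; a literal is a pair (edge index, sign)."""
    clauses = []
    for u in left:
        inc = [i for i, (a, b) in enumerate(edges) if a == u]
        clauses.append([(i, True) for i in inc])          # u is matched by some edge
    for v in list(left) + list(right):
        inc = [i for i, e in enumerate(edges) if v in e]
        for i, j in combinations(inc, 2):                 # v is matched at most once
            clauses.append([(i, False), (j, False)])
    return clauses

def satisfiable(clauses, n_vars):
    """Brute-force CNF satisfiability over all 2^n_vars assignments."""
    return any(all(any(assign[i] == sign for i, sign in clause)
                   for clause in clauses)
               for assign in product([False, True], repeat=n_vars))
```

For |L| = |R|, the formula is satisfiable exactly when the graph has a perfect matching.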

- Meeting 1: NP, NP-hardness, and NP-completeness. Circuit satisfiability and the Cook-Levin Theorem. NP-completeness of CNF-SAT. (Lecture 30)
- Meeting 2: NP-completeness of Independent set, vertex cover, and subset sum. (Lecture 30)
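The equivalence underlying the independent set / vertex cover reduction -- S is independent iff V \ S is a vertex cover -- can be verified by brute force on a small graph.

```python
from itertools import combinations

def is_independent(edges, s):
    """No edge has both endpoints inside s."""
    return all(not (u in s and v in s) for u, v in edges)

def is_vertex_cover(edges, c):
    """Every edge has at least one endpoint inside c."""
    return all(u in c or v in c for u, v in edges)

# On every subset of a small graph, S is independent iff its complement covers.
nodes = range(5)
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2)]
for r in range(6):
    for subset in combinations(nodes, r):
        s = set(subset)
        assert is_independent(edges, s) == is_vertex_cover(edges, set(nodes) - s)
```

A consequence: the maximum independent set size and minimum vertex cover size always sum to the number of vertices.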

- Meeting 1: NP-completeness of subset sum and 3-colorability. (Lecture 30)
- Meeting 2: Solving problems from Lecture 30.
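A brute-force 3-colorability check, useful for sanity-testing small instances when working through the reductions above; this is an illustration, not an efficient algorithm.

```python
from itertools import product

def three_colorable(n, edges):
    """Decide 3-colorability of a graph on vertices 0..n-1 by trying
    every assignment of 3 colors to the n vertices."""
    return any(all(col[u] != col[v] for u, v in edges)
               for col in product(range(3), repeat=n))
```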