This page summarizes the lectures and mentions the suggested reading assignments.
This summary is quite brief; its intent is to help the student and
instructor recollect what was covered. It is no substitute for participating
in the lectures. We will be using Jeff Erickson's notes.
Week 1
- Meeting 1: Stable Matching. Section 0.5 (Lecture 0) of Jeff Erickson's notes.
- Meeting 2: Implementation of the stable matching algorithm so that it runs in O(n^2) time. The Towers of Hanoi problem (Sections 1.2 and 1.3).
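The key to the O(n^2) implementation is making each proposal O(1): each proposer keeps a pointer into their preference list, and each receiver precomputes a rank table so comparing two suitors is a single lookup. A minimal sketch (function and variable names are my own, not from the notes):

```python
from collections import deque

def stable_matching(prop_prefs, recv_prefs):
    """Gale-Shapley in O(n^2): at most n^2 proposals, each O(1).
    prop_prefs[p] and recv_prefs[r] are permutations of 0..n-1."""
    n = len(prop_prefs)
    # rank[r][p] = position of proposer p in receiver r's list,
    # so "does r prefer p to cur?" is an O(1) comparison
    rank = [[0] * n for _ in range(n)]
    for r in range(n):
        for pos, p in enumerate(recv_prefs[r]):
            rank[r][p] = pos
    next_choice = [0] * n      # next index each proposer will propose to
    match_of = [None] * n      # match_of[r] = current partner of receiver r
    free = deque(range(n))
    while free:
        p = free.popleft()
        r = prop_prefs[p][next_choice[p]]
        next_choice[p] += 1
        cur = match_of[r]
        if cur is None:
            match_of[r] = p
        elif rank[r][p] < rank[r][cur]:
            match_of[r] = p
            free.append(cur)   # displaced proposer becomes free again
        else:
            free.append(p)     # r rejects p; p stays free
    return match_of
```

Each proposer advances their pointer at every proposal and never revisits an entry, which is what caps the total work at n^2 proposals.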
Week 2
- Meeting 1: Mergesort (1.4) and Quicksort (1.5).
- Meeting 2: Selection (1.7)
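Section 1.7 develops the deterministic median-of-medians algorithm; as an illustration of the underlying partition-and-recurse idea, here is the simpler randomized variant (revisited later in the analysis lectures). A sketch under my own naming:

```python
import random

def quickselect(a, k):
    """Return the k-th smallest element of a (k is 0-indexed).
    Partition around a random pivot, then recurse into one side;
    expected O(n) time."""
    a = list(a)
    while True:
        if len(a) == 1:
            return a[0]
        pivot = random.choice(a)
        less = [x for x in a if x < pivot]
        equal = [x for x in a if x == pivot]
        if k < len(less):
            a = less                          # answer is below the pivot
        elif k < len(less) + len(equal):
            return pivot                      # pivot is the answer
        else:
            k -= len(less) + len(equal)       # answer is above the pivot
            a = [x for x in a if x > pivot]
```

The deterministic version in 1.7 replaces `random.choice` with a pivot guaranteed to be near the median.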
Week 3
- Meeting 1: Selection (1.7) and Integer Multiplication (1.8).
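The point of Section 1.8 is replacing four half-size products with three. A Karatsuba sketch (my own code, not from the notes), using bit shifts for the splits:

```python
def karatsuba(x, y):
    """Multiply nonnegative integers with three recursive half-size
    products instead of four, the Section 1.8 divide and conquer."""
    if x < 10 or y < 10:
        return x * y
    m = max(x.bit_length(), y.bit_length()) // 2
    xh, xl = x >> m, x & ((1 << m) - 1)   # x = xh * 2^m + xl
    yh, yl = y >> m, y & ((1 << m) - 1)   # y = yh * 2^m + yl
    a = karatsuba(xh, yh)                         # high * high
    c = karatsuba(xl, yl)                         # low * low
    b = karatsuba(xh + xl, yh + yl) - a - c       # cross terms, one product
    return (a << (2 * m)) + (b << m) + c
```

Three subproblems of half the size give the O(n^{lg 3}) bound instead of O(n^2).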
- Meeting 2:
Week 4
- Meeting 1: Completing the discussion of Integer Multiplication (1.8). The subset sum problem from Section 3.3.
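The Section 3.3 recursion for subset sum is the "take it or leave it" case split on the first element. A minimal sketch (assuming positive integers; the name is mine):

```python
def subset_sum(vals, target):
    """Does some subset of vals (positive ints) sum to target?
    Either the first element is in the subset, or it isn't."""
    if target == 0:
        return True
    if not vals or target < 0:
        return False
    return (subset_sum(vals[1:], target - vals[0])   # take vals[0]
            or subset_sum(vals[1:], target))         # skip vals[0]
```

Without memoization this is exponential, which sets up the Week 5 discussion.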
- Meeting 2: Recursive thinking for the longest increasing subsequence (LIS) from Section 3.6. An efficient algorithm using memoization, and also via an iterative algorithm. This is from Section 5.2.
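Both versions of LIS can be sketched briefly (code and names are mine; the iterative version below is one standard formulation of the same O(n^2) idea):

```python
from functools import lru_cache

def lis_memo(a):
    """Memoized LIS(i, j) recursion of Sections 3.6/5.2: length of the
    longest increasing subsequence of a[j:] all of whose elements
    exceed a[i]; i == -1 imposes no constraint."""
    n = len(a)

    @lru_cache(maxsize=None)
    def lis(i, j):
        if j == n:
            return 0
        best = lis(i, j + 1)                     # skip a[j]
        if i == -1 or a[j] > a[i]:
            best = max(best, 1 + lis(j, j + 1))  # take a[j]
        return best

    return lis(-1, 0)

def lis_iter(a):
    """Iterative version: L[j] = length of the longest increasing
    subsequence ending at a[j]."""
    n = len(a)
    L = [1] * n
    for j in range(n):
        for i in range(j):
            if a[i] < a[j]:
                L[j] = max(L[j], L[i] + 1)
    return max(L, default=0)
```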
Week 5
- Meeting 1: Subset Sum -- memoization, and an iterative algorithm. See 5.6 for a brief summary.
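The iterative version fills a boolean table indexed by achievable sums. A one-dimensional sketch (my own compression of the two-dimensional table; names are mine):

```python
def subset_sum_dp(vals, target):
    """Iterative subset sum: after processing a prefix of vals,
    S[t] is True iff some subset of that prefix sums to t.
    O(n * target) time."""
    S = [False] * (target + 1)
    S[0] = True                              # the empty subset sums to 0
    for v in vals:
        # descend so each element is used at most once per pass
        for t in range(target, v - 1, -1):
            if S[t - v]:
                S[t] = True
    return S[target]
```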
- Meeting 2: Edit Distance from Section 5.5. The recursive thinking, followed by an iterative algorithm and example.
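The iterative algorithm fills the standard (m+1) x (n+1) table row by row. A sketch of the Section 5.5 recurrence (variable names are mine):

```python
def edit_distance(a, b):
    """D[i][j] = edit distance between a[:i] and b[:j], built from
    the three choices: substitute/match, delete from a, insert from b."""
    m, n = len(a), len(b)
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        D[i][0] = i                       # delete all of a[:i]
    for j in range(n + 1):
        D[0][j] = j                       # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = D[i - 1][j - 1] + (a[i - 1] != b[j - 1])
            D[i][j] = min(sub,            # substitute (free if equal)
                          D[i - 1][j] + 1,   # delete a[i-1]
                          D[i][j - 1] + 1)   # insert b[j-1]
    return D[m][n]
```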
Week 6
- Meeting 1: The randomized protocol for contention resolution. This is not from Jeff Erickson's notes. See Files --> Handouts in ICON for posted notes.
- Meeting 2: Verifying Matrix Product. Also not from Jeff's notes. See Files --> Handouts in ICON for posted notes.
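The verification idea (Freivalds' check) is to test A·B = C against random vectors in O(n^2) per trial rather than recomputing the product. A sketch with my own naming, using 0/1 vectors so a wrong C survives each trial with probability at most 1/2:

```python
import random

def probably_equal(A, B, C, trials=20):
    """Check A @ B == C for square matrices by comparing A(Bx) with Cx
    for random 0/1 vectors x. Always accepts a correct C; rejects a
    wrong C except with probability at most 2^(-trials)."""
    n = len(A)
    for _ in range(trials):
        x = [random.randint(0, 1) for _ in range(n)]
        Bx = [sum(B[i][j] * x[j] for j in range(n)) for i in range(n)]
        ABx = [sum(A[i][j] * Bx[j] for j in range(n)) for i in range(n)]
        Cx = [sum(C[i][j] * x[j] for j in range(n)) for i in range(n)]
        if ABx != Cx:
            return False
    return True
```

Each trial is three matrix-vector products, so 20 trials still cost O(n^2), versus O(n^3) (or fast-matrix-multiplication time) to recompute A·B.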
Week 7
- Meeting 1: Randomized minimum cut, from Lecture 14 of Jeff Erickson's notes.
- Meeting 2: Continuing with randomized minimum cut.
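The contraction algorithm can be sketched with a union-find over super-vertices (my own code; Lecture 14 presents it abstractly). One run returns a random cut, which is a minimum cut with probability at least 2/(n(n-1)), so it is repeated O(n^2 log n) times in the analysis:

```python
import random

def karger_cut(n, edges):
    """One run of contraction on a connected multigraph: repeatedly
    contract a random crossing edge until two super-vertices remain,
    then return the number of edges crossing between them."""
    parent = list(range(n))

    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]   # path halving
            u = parent[u]
        return u

    remaining = n
    edges = list(edges)
    while remaining > 2:
        u, v = random.choice(edges)
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv                 # contract the edge
            remaining -= 1
        # keep only edges still crossing between super-vertices
        edges = [(a, b) for (a, b) in edges if find(a) != find(b)]
    return len(edges)
```

On a cycle, for example, every run returns exactly 2, since the two final super-vertices are contracted arcs.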
Week 8
- Meeting 1: Expectation, linearity, examples of expectation computation. Some of the examples we did can be found in Files --> Handouts --> Expectation and Median in ICON.
- Meeting 2: Analysis of expected running time of quicksort and the randomized selection algorithm. Notes in ICON.
Week 9
- Meeting 1: Expected running time of hash table operations, and universal hash function families. See Lecture 12 of Jeff's notes, and here for an updated version.
- Meeting 2: An application of universal hash functions to estimating frequency of items in a data stream. See these notes from Jeff.
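The stream-frequency estimator (a count-min-style sketch; class and parameter names below are mine) hashes each item into one counter per row using independent draws from a universal family, and answers queries with the row minimum, so collisions can only inflate an estimate, never shrink it:

```python
import random

class CountMin:
    """Frequency estimator for a stream of integer items: depth rows of
    width counters, each row indexed by h(x) = ((a*x + b) mod p) mod width
    with (a, b) drawn at random (a universal-family construction)."""
    def __init__(self, width=256, depth=4, p=2**31 - 1):
        self.w, self.d, self.p = width, depth, p
        self.ab = [(random.randrange(1, p), random.randrange(p))
                   for _ in range(depth)]
        self.table = [[0] * width for _ in range(depth)]

    def _h(self, i, x):
        a, b = self.ab[i]
        return ((a * x + b) % self.p) % self.w

    def add(self, x):
        for i in range(self.d):
            self.table[i][self._h(i, x)] += 1

    def estimate(self, x):
        # row minimum: collisions only ever add to a counter
        return min(self.table[i][self._h(i, x)] for i in range(self.d))
```

The estimate is always at least the true count, and the analysis bounds the overcount in terms of the stream length divided by the width.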
Week 10
- Meeting 1: A proof that the family MP is near-universal can also be found in Jeff's updated notes on hashing mentioned above in Week 9, Meeting 1. The proof I gave in class avoided the terminology of groups and inverses. The discussion of balls and bins and perfect hashing can be found in the sections "High probability bounds: balls and bins" and "Perfect Hashing" in these same notes.
- Meeting 2: Midterm.
Week 11
- Meeting 1: Maximum flows and minimum cuts, from Lecture 23 of Jeff's notes. Here are the slides we used.
- Meeting 2: Correctness of Ford-Fulkerson, running time with integer capacities, mention of the Edmonds-Karp rules and the running times they give. See Lecture 23.
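One of the Edmonds-Karp rules (always augment along a shortest path, found by BFS) is easy to sketch; names are mine, and the residual graph is kept as a capacity matrix:

```python
from collections import deque

def max_flow(cap, s, t):
    """Ford-Fulkerson with shortest augmenting paths (Edmonds-Karp,
    O(V E^2)). cap is an n x n matrix of capacities; res is the
    residual graph, updated after each augmentation."""
    n = len(cap)
    res = [row[:] for row in cap]
    flow = 0
    while True:
        # BFS for a shortest s-t path in the residual graph
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if res[u][v] > 0 and parent[v] == -1:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return flow                 # no augmenting path: flow is maximum
        # find the bottleneck, then augment along the path
        bottleneck = float('inf')
        v = t
        while v != s:
            u = parent[v]
            bottleneck = min(bottleneck, res[u][v])
            v = u
        v = t
        while v != s:
            u = parent[v]
            res[u][v] -= bottleneck
            res[v][u] += bottleneck     # residual back-edge allows undo
            v = u
        flow += bottleneck
```

With integer capacities, plain Ford-Fulkerson (any augmenting path) already terminates in O(E * max-flow-value) time; the shortest-path rule removes the dependence on capacity values.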
Week 12
- Meeting 1: Application of Maximum Flow to Maximum Bipartite Matching. See Lecture 24 (Applications of Maximum Flow) of Jeff's notes.
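Running Ford-Fulkerson on the unit-capacity bipartite network is equivalent to the following augmenting-path matching sketch (my own code, giving the combinatorial view of the same algorithm):

```python
def max_bipartite_matching(adj, n_right):
    """Maximum matching by augmenting paths: each left vertex tries to
    claim a right neighbor, possibly evicting a previous claimant who
    then searches for an alternative. adj[u] lists right-side
    neighbors of left vertex u."""
    match_r = [-1] * n_right      # match_r[v] = left partner of v, or -1

    def try_augment(u, seen):
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                # v is free, or v's current partner can move elsewhere
                if match_r[v] == -1 or try_augment(match_r[v], seen):
                    match_r[v] = u
                    return True
        return False

    return sum(try_augment(u, set()) for u in range(len(adj)))
```

Each successful `try_augment` call corresponds to one unit of flow pushed along an augmenting path in the flow network.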
- Meeting 2: A handout for the image segmentation application of minimum cuts is in ICON. The project selection application is in 24.6 of Jeff's lectures.
Week 13
- Meeting 1: Introduction to the SAT and CNF-SAT problems. Reductions: LIS to longest path in DAGs.
- Meeting 2: The independent set problem, reduction of some problems, including maximum matching, to independent set. Reduction of CNF-SAT to independent set, covered in Lecture 30 of Jeff's notes. Defining polynomial-time (Cook) reducibility.
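The CNF-SAT-to-independent-set reduction is concrete enough to code: one vertex per literal occurrence, a clique within each clause, and an edge between every contradictory pair, so the formula is satisfiable iff the graph has an independent set of size equal to the number of clauses. A small sketch (names and literal encoding are mine: nonzero ints, with -x meaning "not x"; the brute-force checker is only for tiny instances):

```python
from itertools import combinations

def sat_to_independent_set(clauses):
    """Build the reduction graph: vertices are (clause index, literal)
    pairs; edges join literals in the same clause and contradictory
    literals across clauses."""
    verts = [(i, lit) for i, cl in enumerate(clauses) for lit in cl]
    edges = set()
    for u, v in combinations(range(len(verts)), 2):
        (ci, li), (cj, lj) = verts[u], verts[v]
        if ci == cj or li == -lj:       # same clause, or x vs not-x
            edges.add((u, v))
    return verts, edges

def has_independent_set(n_verts, edges, k):
    """Exponential brute-force check, for sanity-testing the reduction."""
    for S in combinations(range(n_verts), k):
        s = set(S)
        if all(not (u in s and v in s) for u, v in edges):
            return True
    return False
```

An independent set of size len(clauses) must pick one literal per clause with no contradictions, i.e., a satisfying assignment.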
Week 14
- Meeting 1: Decision problems vs. Optimization problems. The classes P and NP, with examples. How to show that a problem belongs to NP. The P vs NP question.
- Meeting 2: Polynomial-time Karp reducibility, consequences, definitions of NP-hard and NP-complete, Cook-Levin Theorem, showing NP-completeness of Independent Set.