Fall 2019, 9:30-10:45 TTh 15 SH (Schaeffer Hall)

**Instructor:**
Sriram V. Pemmaraju

101G MLH, sriram-pemmaraju@uiowa.edu, 319-353-2956

Office Hours: 1:30-2:30 M, 10:30-11:30 W, 2:00-3:00 F (and by appointment).

**Course webpage:** `homepage.cs.uiowa.edu/~sriram/5350/fall19/`

**Department website:** `http://www.cs.uiowa.edu/`

This is the graduate algorithms course aimed at CS PhD students. A typical student in the course is assumed to have a solid background in undergraduate-level algorithms; refer to the syllabus of a recent offering of CS:3330 (e.g., https://homepage.divms.uiowa.edu/~sriram/3330/spring18/syllabus.html) to better understand this expectation. Facility in discrete math, especially discrete probability, will be a definite benefit for students in this course. The course material is organized into four parts: (i) combinatorial optimization, (ii) randomized algorithms, (iii) approximation algorithms, and (iv) streaming and parallel algorithms. The material on streaming and parallel algorithms can be viewed as a modern application of the material in parts (i)-(iii).


- There will be an open-book/notes midterm exam on Thu, Oct 10th, 6:30pm-8:30pm, in W107 PBB (Pappajohn Business Building). Click here for more details.

- Homework 1. Due: Tue, 9/17.
- Homework 2. Due: Tue, 10/8.
- A couple of simple practice problems on Distributed MIS, in preparation for the midterm.
- Homework 3. Due: Thu, 11/8.
- Homework 4. Due: Thu, 12/12.
- Final Exam. Due: Mon, 12/16, 5 pm by email.

- Sample latex source files for scribe notes are in this directory.
- The appendices in the textbook *Introduction to Algorithms* (2nd ed.) by Cormen, Leiserson, Rivest, and Stein contain a helpful overview of the discrete math that will be used in this course. Specifically, Appendix C on "Counting and Probability" will be helpful for students seeking to brush up their discrete probability knowledge.
- Read the proof of the specific version of the flow-decomposition theorem we discussed in class in these lecture notes by Luca Trevisan; see Lemma 2 and its proof.

**8/26-8/30**

**(8/27)** Overview of course content. We placed flow, cut, and matching problems in the landscape of graph problems. Definitions of flow, flow conservation, feasible flow, and flow value. Statement of the MAXFLOW problem. 8.27 Notes
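
The definitions above can be checked mechanically. Below is a minimal Python sketch (the example network, function names, and edge-dictionary layout are illustrative, not from the lecture notes) that verifies feasibility and computes the value of a flow.

```python
def flow_value(flow, source):
    """Value of a flow = net flow out of the source."""
    out_f = sum(f for (u, v), f in flow.items() if u == source)
    in_f = sum(f for (u, v), f in flow.items() if v == source)
    return out_f - in_f

def is_feasible(flow, capacity, source, sink):
    """Check capacity constraints and flow conservation at internal nodes."""
    if any(f < 0 or f > capacity[e] for e, f in flow.items()):
        return False
    nodes = {u for u, _ in flow} | {v for _, v in flow}
    for x in nodes - {source, sink}:
        into = sum(f for (u, v), f in flow.items() if v == x)
        outof = sum(f for (u, v), f in flow.items() if u == x)
        if into != outof:
            return False
    return True

# Tiny example network: s -> a -> t plus a direct edge s -> t.
capacity = {('s', 'a'): 3, ('a', 't'): 2, ('s', 't'): 1}
flow = {('s', 'a'): 2, ('a', 't'): 2, ('s', 't'): 1}
```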

**(8/29)** Applications of the MAXFLOW problem. Example of a reduction of the Baseball Elimination problem to MAXFLOW. Linear programming (LP) formulation of the MAXFLOW problem. 8.29 Notes

**Readings:** Sections 10.1, 10.2, and 11.6.

**9/2-9/6**

**(9/3)** Statement of the MINCUT problem. Introduction to the duality between MAXFLOW and MINCUT. Proof of the weak version of the maxflow-mincut theorem. Description of the proof strategy for the maxflow-mincut theorem. 9.3 Notes

**(9/5)** Definitions of residual capacities, residual graph, and augmenting paths. Working on the proof of the maxflow-mincut theorem. 9.5 Notes

**Readings:** Sections 10.3 and 10.4.

**9/9-9/13**

**(9/10)** Completion of the proof of the maxflow-mincut theorem. Analysis of the Ford-Fulkerson algorithm. Introduction to the Edmonds-Karp "fat pipe" heuristic.
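
A minimal sketch of the generic Ford-Fulkerson loop: maintain the residual graph, find any augmenting path (here via DFS), push the bottleneck flow, and repeat. Integral capacities are assumed, and all names are illustrative.

```python
from collections import defaultdict

def ford_fulkerson(capacity, s, t):
    """Generic Ford-Fulkerson: repeatedly find an augmenting path in the
    residual graph (found here by DFS) and push the bottleneck flow."""
    residual = defaultdict(int)
    for (u, v), c in capacity.items():
        residual[(u, v)] += c
    nodes = {u for u, v in capacity} | {v for u, v in capacity}

    def augmenting_path():
        stack, parent = [s], {s: None}
        while stack:
            u = stack.pop()
            if u == t:  # reconstruct the s-t path from parent pointers
                path = []
                while parent[u] is not None:
                    path.append((parent[u], u))
                    u = parent[u]
                return path[::-1]
            for v in nodes:
                if v not in parent and residual[(u, v)] > 0:
                    parent[v] = u
                    stack.append(v)
        return None

    value = 0
    while (path := augmenting_path()) is not None:
        bottleneck = min(residual[e] for e in path)
        for (u, v) in path:
            residual[(u, v)] -= bottleneck
            residual[(v, u)] += bottleneck  # add reverse residual capacity
        value += bottleneck
    return value

# Illustrative network; its max flow value is 6.
capacity = {('s', 'a'): 4, ('s', 'b'): 2, ('a', 'b'): 3,
            ('a', 't'): 1, ('b', 't'): 6}
```

With integral capacities this terminates with the max flow regardless of which augmenting path DFS happens to find; the Edmonds-Karp heuristics discussed in class bound the number of iterations by choosing paths more carefully.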

**(9/12)** The flow decomposition theorem. Analysis of the Edmonds-Karp "fat pipe" heuristic. Analysis of the Edmonds-Karp "fewest pipes" heuristic. 9.12 Notes

**Readings:** Section 10.6.

**9/16-9/20**

**(9/17)** Completion of the analysis of the Edmonds-Karp "fewest pipes" heuristic. Introduction to LP duality. The weak duality and strong duality theorems. 9.17 Notes

**(9/19)** More on LP duality. The primal LP for max-flow and its dual LP. 9.19 Notes

**Readings:** Section 10.6 (specifically the proofs of Lemma 10.2 and Lemma 10.3), Luca Trevisan's lecture notes on LP duality, and his lecture notes on flow and cut LPs.

**9/23-9/27**

**(9/24)** Interpretation of the dual LP as a min-cut relaxation. Proving that the integral version of the dual LP models the min-cut problem. Concluding via the maxflow-mincut theorem that the dual LP has an integral optimal solution. 9.24 Notes

**(9/26)** Introduction to the maximum matching problem. Bipartite matching, a special case of the maximum matching problem. Reducing the bipartite matching problem to maximum flow. Introduction to the distributed Maximal Independent Set (MIS) problem. A deterministic, greedy O(n)-round algorithm for the problem. 9.26 Notes
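
The greedy O(n)-round algorithm visits vertices in a fixed order. A sequential simulation (not the distributed implementation itself; the adjacency-list encoding is illustrative) is a few lines:

```python
def greedy_mis(adj):
    """Sequential greedy MIS: visit vertices in a fixed (sorted) order and
    add a vertex whenever none of its neighbors is already in the set."""
    mis = set()
    for v in sorted(adj):
        if not any(u in mis for u in adj[v]):
            mis.add(v)
    return mis

# Path 1 - 2 - 3 - 4: the greedy order picks {1, 3}.
adj = {1: [2], 2: [1, 3], 3: [2, 4], 4: [3]}
```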

**Readings:** Luca Trevisan's notes on maximum matching in bipartite graphs.

**9/30-10/4**

**(10/1)** Description of Luby's randomized algorithm for the distributed MIS problem. Starting the analysis for showing that the algorithm runs in O(log Delta log n) rounds. 10.1 Notes
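
Below is a sequential simulation of one common variant of Luby's algorithm (marking probability 1/(2*deg), ties broken by degree then id); the details may differ slightly from the version presented in class, so treat this as a sketch.

```python
import random

def luby_mis(adj, rng=random.Random(0)):
    """Simulate Luby's MIS algorithm round by round: each active vertex
    marks itself with probability 1/(2*deg); a marked vertex joins the MIS
    if it beats all marked neighbors on (degree, id); MIS vertices and
    their neighbors then deactivate."""
    active = set(adj)
    mis = set()
    while active:
        deg = {v: sum(1 for u in adj[v] if u in active) for v in active}
        marked = {v for v in active
                  if deg[v] == 0 or rng.random() < 1.0 / (2 * deg[v])}
        winners = {v for v in marked
                   if all((deg[u], u) < (deg[v], v)
                          for u in adj[v] if u in marked)}
        mis |= winners
        active -= winners | {u for v in winners for u in adj[v]}
    return mis

# 6-cycle on vertices 0..5.
adj6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
```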

**(10/3)** Analysis of Luby's algorithm showing that in an iteration, a high-degree node has at least constant probability of being deactivated. 10.3 Notes

**Readings:** Gopal Pandurangan's book on distributed MIS. Read Chapter 6, pages 69-75 (until "Better Analysis").

**10/7-10/11**

**(10/8)** Completed the analysis of Luby's algorithm. 10.8 Notes

**(10/10)** Solved problems, in preparation for the midterm, on (i) Luby's MIS algorithm analysis, (ii) reducing problems to maxflow/mincut problems, and (iii) integer programs, LP relaxations, LP duality. 10.10 Notes

**Readings:** Gopal Pandurangan's book on distributed MIS. Read Chapter 6, pages 69-75 (until "Better Analysis").

**10/14-10/18**

**(10/15)** Defining the running time and error probability of randomized algorithms. Types of randomized algorithms: *Las Vegas* and *Monte Carlo*. "Toy" examples of Las Vegas and Monte Carlo algorithms using the Balanced Partition problem. The MINCUT problem and introduction to Karger's MINCUT algorithm. 10.15 Notes

**(10/17)** Analysis of Karger's MINCUT algorithm. 10.17 Notes
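
A minimal sketch of Karger's contraction algorithm (the union-find bookkeeping and the example graph are illustrative): contract a uniformly random non-self-loop edge until two super-nodes remain, and repeat the trial to boost the success probability from Omega(1/n^2) per trial.

```python
import random

def karger_mincut(edges, trials=200, rng=random.Random(1)):
    """Karger's algorithm: repeatedly contract a uniformly random edge
    until two super-nodes remain; the edges crossing between them form a
    cut. Many independent trials find a min cut with high probability."""
    nodes = sorted({u for e in edges for u in e})
    best = len(edges)
    for _ in range(trials):
        parent = {v: v for v in nodes}  # union-find over contractions
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        k = len(nodes)
        while k > 2:
            u, v = edges[rng.randrange(len(edges))]
            ru, rv = find(u), find(v)
            if ru != rv:        # picking a self-loop is a no-op
                parent[ru] = rv
                k -= 1
        best = min(best, sum(1 for u, v in edges if find(u) != find(v)))
    return best

# Two triangles joined by a single bridge edge (c, d); min cut = 1.
edges = [('a', 'b'), ('b', 'c'), ('a', 'c'),
         ('d', 'e'), ('e', 'f'), ('d', 'f'), ('c', 'd')]
```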

**Readings:** No readings.

**10/21-10/25**

**(10/22)** *Linearity of expectation*, *indicator* random variables. Example of the use of linearity of expectation in the analysis of randomized Quicksort. Proof that randomized Quicksort runs in O(n log n) expected time. Another example of linearity of expectation in action: the *Coupon Collector's problem*. 10.22 Notes
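
Randomized Quicksort itself is short; a sketch (out-of-place for brevity, names illustrative) in which the pivot is chosen uniformly at random, so the expected number of comparisons is O(n log n) by linearity of expectation over element pairs:

```python
import random

def rand_quicksort(a, rng=random.Random(0)):
    """Randomized Quicksort with a uniformly random pivot."""
    if len(a) <= 1:
        return list(a)
    pivot = a[rng.randrange(len(a))]
    return (rand_quicksort([x for x in a if x < pivot], rng)
            + [x for x in a if x == pivot]
            + rand_quicksort([x for x in a if x > pivot], rng))
```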

**(10/24)** Analysis of the Coupon Collector's problem. *Markov's Inequality*. The use of Markov's Inequality in the analysis of a 1/4-approximation to MAXCUT. 10.24 Notes
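
The randomized MAXCUT algorithm behind that analysis is one line of randomness: place each vertex on a random side, so each edge is cut with probability 1/2 and the expected cut size is |E|/2 >= OPT/2; Markov's Inequality (applied to the uncut edges) then gives a constant probability of a reasonably large cut. A sketch (edge-list encoding illustrative):

```python
import random

def random_cut(edges, rng=random.Random(0)):
    """Assign each vertex a random side; return the cut size and sides."""
    nodes = {u for e in edges for u in e}
    side = {v: rng.randrange(2) for v in nodes}
    cut = sum(1 for u, v in edges if side[u] != side[v])
    return cut, side

# On a triangle, any 2-coloring cuts either 0 or 2 of the 3 edges.
triangle = [('a', 'b'), ('b', 'c'), ('a', 'c')]
```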

**Readings:** Lecture notes on randomized algorithms from Stanford.

**10/28-11/1**

**(10/29)** Introduction to *approximation algorithms*. Definition of an *alpha-approximation* algorithm for minimization/maximization problems. A 2-approximation algorithm for Minimum Vertex Cover (MVC) via maximal matchings. 10.29 Notes

**(10/31)** Analysis of the 2-approximation algorithm for MVC. A *primal-dual* view of this algorithm using LP duality. A 2-approximation algorithm for MVC using *deterministic LP-rounding*. 10.31 Notes
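
The maximal-matching algorithm for MVC is a one-pass sketch (edge-list encoding illustrative): any vertex cover must use at least one endpoint of each matched edge, so taking both endpoints costs at most 2*OPT.

```python
def vertex_cover_2approx(edges):
    """Greedily build a maximal matching; take both endpoints of every
    matched edge. The result is a vertex cover of size <= 2 * OPT."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:  # (u, v) extends the matching
            cover.update((u, v))
    return cover

edges_vc = [(1, 2), (2, 3), (3, 4), (4, 1), (2, 4)]
```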

**Readings:** No readings.

**11/4-11/8**

**(11/5)** Greedy approximation algorithms. A greedy 2-approximation algorithm for the k-CENTER problem. 11.5 Notes
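
A sketch of the greedy (farthest-point, i.e., Gonzalez-style) 2-approximation for k-CENTER; the point set and distance function are illustrative, and any metric `dist` works.

```python
def greedy_k_center(points, k, dist):
    """Greedy 2-approximation for k-CENTER: start from an arbitrary point,
    then repeatedly add the point farthest from the current centers."""
    centers = [points[0]]
    while len(centers) < k:
        centers.append(max(points,
                           key=lambda p: min(dist(p, c) for c in centers)))
    return centers

# Points on a line; with k = 2 the greedy picks one point per cluster.
centers = greedy_k_center([0, 1, 10, 11], 2, lambda a, b: abs(a - b))
```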

**(11/7)** Analysis of the k-CENTER greedy 2-approximation algorithm. Reduction from Minimum Dominating Set showing that a rho-approximation, for rho strictly less than 2, for k-CENTER is not possible unless P = NP. 11.7 Notes

**Readings:** Lecture notes by Michael Dinitz.

**11/11-11/15**

**(11/12)** The SETCOVER problem. A greedy approximation algorithm for SETCOVER, yielding an O(log m)-approximation, where m = size of the ground set. 11.12 Notes
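
The greedy rule is "always take the set covering the most still-uncovered elements." A minimal sketch (the instance below is illustrative, and the sets are assumed to cover the ground set):

```python
def greedy_set_cover(ground, sets):
    """Greedy SETCOVER: repeatedly choose the set covering the largest
    number of still-uncovered elements; an O(log m)-approximation."""
    uncovered = set(ground)
    chosen = []
    while uncovered:
        i = max(range(len(sets)), key=lambda j: len(sets[j] & uncovered))
        chosen.append(i)
        uncovered -= sets[i]
    return chosen

ground = {1, 2, 3, 4, 5}
sets = [{1, 2, 3}, {2, 4}, {4, 5}, {1, 5}]
```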

**(11/14)** A pseudopolynomial-time dynamic programming algorithm for KNAPSACK. Idea of a Fully Polynomial-Time Approximation Scheme (FPTAS) for KNAPSACK. 11.14 Notes (unedited)
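
A sketch of one standard form of the pseudopolynomial DP (the by-value table used by the FPTAS; variable names illustrative): `best[v]` is the minimum weight of a subset achieving total value exactly `v`, giving running time O(nV) for V = sum of values.

```python
def knapsack(values, weights, W):
    """Pseudopolynomial DP for KNAPSACK over total value; the FPTAS runs
    this same table after rounding the item values down."""
    INF = float('inf')
    V = sum(values)
    best = [0] + [INF] * V  # best[v] = min weight achieving value v
    for val, wt in zip(values, weights):
        for v in range(V, val - 1, -1):  # reverse order: each item once
            best[v] = min(best[v], best[v - val] + wt)
    return max(v for v in range(V + 1) if best[v] <= W)
```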

**Readings:** Section 1.6 (A greedy algorithm for SETCOVER) and Section 3.1 (The Knapsack problem) from "The Design of Approximation Algorithms" by Williamson and Shmoys.

**11/18-11/22**

**(11/19)** Analysis of the FPTAS for KNAPSACK. 11.19 Notes (unedited)

**(11/21)** Introduction to approximation algorithms via LP-rounding. An f-approximation for SETCOVER via deterministic LP-rounding. An O(log m)-approximation via randomized rounding. 11.21 Notes (unedited)

**Readings:** Section 3.1 (The Knapsack problem) and Section 1.7 (A randomized rounding algorithm for SETCOVER) from "The Design of Approximation Algorithms" by Williamson and Shmoys.

**12/2-12/6**

**(12/3)** Finished up the analysis of randomized rounding for SETCOVER. MAXSAT: a randomized 1/2-approximation for MAXSAT. 12.3 Notes (unedited)
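
The randomized 1/2-approximation for MAXSAT is just a fair coin per variable: a clause with k literals is satisfied with probability 1 - 2^(-k) >= 1/2, so by linearity of expectation at least half the clauses are satisfied in expectation. A sketch (the DIMACS-style clause encoding, +i for x_i and -i for its negation, is an assumption of this sketch):

```python
import random

def random_maxsat(clauses, n, rng=random.Random(0)):
    """Set each of the n variables True/False with probability 1/2 and
    count satisfied clauses. Clauses are lists of nonzero ints."""
    x = {i: rng.randrange(2) == 1 for i in range(1, n + 1)}
    sat = sum(1 for c in clauses
              if any((lit > 0) == x[abs(lit)] for lit in c))
    return sat, x

# (x1) and (not x1) and (x1 or x2): any assignment satisfies 1 or 2 clauses.
clauses = [[1], [-1], [1, 2]]
```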

**(12/5)** A (1-1/e)-approximation via LP randomized rounding for MAXSAT. A "best of two algorithms" approach to get a 3/4-approximation to MAXSAT.

**Readings:** Sections 5.4 and 5.5 (Randomized Rounding for MAXSAT) from "The Design of Approximation Algorithms" by Williamson and Shmoys.

**12/9-12/13**

**(12/10)** Derandomization of the MAXSAT randomized approximation algorithm, using the method of conditional expectations. 12.10 Notes (unedited)
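
A sketch of the method of conditional expectations applied to the simple random-assignment MAXSAT algorithm (same illustrative clause encoding as above: +i for x_i, -i for its negation): fix variables one at a time, always choosing the value that keeps the conditional expected number of satisfied clauses at least as large, so the final assignment satisfies at least the unconditional expectation.

```python
def derandomized_maxsat(clauses, n):
    """Fix x_1, ..., x_n greedily by conditional expectations; unset
    variables are modeled as independent fair coins."""
    def expected_sat(assign):
        total = 0.0
        for c in clauses:
            p_unsat = 1.0
            satisfied = False
            for lit in c:
                v = abs(lit)
                if v in assign:
                    if (lit > 0) == assign[v]:
                        satisfied = True
                        break
                    # a falsified literal contributes factor 1 to p_unsat
                else:
                    p_unsat *= 0.5  # an unset literal fails with prob 1/2
            total += 1.0 if satisfied else 1.0 - p_unsat
        return total

    assign = {}
    for v in range(1, n + 1):
        if expected_sat({**assign, v: True}) >= expected_sat({**assign, v: False}):
            assign[v] = True
        else:
            assign[v] = False
    return assign

cnf = [[1, 2], [-1], [2]]
```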

**(12/12)** Solved practice problems for the final exam.