Algorithmic Excursions: Topics in Computer Science II, CS:4980:0002 (22C:196:002)
Coordinates
The course meets 11:00 am--12:15 pm, Tuesday and Thursday, in 27 MH (MacBride Hall).
Instructor
Kasturi Varadarajan, 101D MacLean Hall, Phone: 335-0732, email: firstname-lastname@uiowa.edu. Office Hours: 1:30--3:00 Wednesday, 3:30--5:00 Thursday.
What this Course is About
In this course, we explore some advanced topics in algorithm design and analysis that are of broad interest. We study (1) approximation algorithms, which output approximately optimal solutions, with a focus on geometric examples; (2) randomized schemes such as sampling and sketching; (3) algorithms in the data stream model, where the algorithm is typically allowed only sublinear space and a few passes over the input; (4) the multiplicative weights update method and some of its applications.
We also study topics beyond typical worst-case analysis. We examine parameterized complexity, where the input is characterized by one or more parameters in addition to the usual input size. We consider input models where the input is an `easy' instance corrupted by random noise, and review algorithm design in this setting.
References
We expect to use material drawn from a few different sources.
- Geometric Approximation Algorithms, by Sariel Har-Peled; this will serve as a useful reference for about a third of the course. Note that an electronic version of the text is available at the UI library.
- For the portion of the course discussing streaming algorithms, useful references are the lecture notes of Amit Chakrabarti and Jelani Nelson.
- For the portion on the multiplicative weights update method, our reference is the survey by Arora, Hazan, and Kale titled `The Multiplicative Weights Update Method: A Meta-Algorithm and Applications'.
Prerequisites
A strong background from an undergraduate-level algorithms course is expected, together with an understanding of the basic methodology for analyzing randomized algorithms. The latter means familiarity with random variables, expectation, linearity of expectation, the Markov and Chebyshev inequalities, and Chernoff bounds, along with their typical uses in analyzing randomized algorithms; the bounds are stated below for quick reference.
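For reference, here are the three tail bounds in their most common forms (standard material; the Chernoff bound in particular comes in many variants, and this is just one of them):

    \Pr[X \ge a] \le \frac{\mathbb{E}[X]}{a}
        (Markov: X \ge 0, a > 0)

    \Pr\big[|X - \mathbb{E}[X]| \ge a\big] \le \frac{\mathrm{Var}[X]}{a^2}
        (Chebyshev: a > 0)

    \Pr[X \ge (1+\delta)\mu] \le e^{-\delta^2 \mu / 3}, \qquad
    \Pr[X \le (1-\delta)\mu] \le e^{-\delta^2 \mu / 2}
        (Chernoff: X a sum of independent \{0,1\}-valued random variables,
         \mu = \mathbb{E}[X], 0 < \delta \le 1)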
Grading
Each student will be responsible for scribing a week's worth of lecture notes (slightly more if needed); this will account for 20 percent of the grade. Homeworks will account for 40 percent of the grade -- I expect there will be three to four homeworks. Class participation will account for 10 percent of the grade, and a review or survey paper will account for the remaining 30 percent. There will be no exams.
Scribed lecture notes are due within one week of the end of the week the student is responsible for.
The policy on late homeworks: you have a quota of six late days for the entire semester. At most three late days may be applied to any one homework, and once your six-day quota is used up, homeworks submitted late will not be accepted. When you submit a homework X days late, your quota is irrevocably decreased by X. Lateness is counted in whole days, rounded up; for example, a submission that is 30 hours late uses two late days.
Handouts and Homeworks
Lecture Notes
All scribed notes are subject to revision -- if you spot errors, do let me know.
- Week 1: Sampling in geometric spaces, VC dimension, the shatter function, and epsilon nets. Scribed notes (may be revised).
- Week 2: The above, continued. Epsilon approximations and discrepancy. Scribed notes (may be revised). A good reference for this material is Chapter 5 of `Geometric Approximation Algorithms' by Sariel Har-Peled; an electronic version is available at the UI library.
- Week 3: Epsilon approximations via discrepancy. Epsilon nets from epsilon approximations. The Epsilon Net Theorem -- the double sampling proof. This can be found in Har-Peled's book; I followed an exposition due to Matousek, which I have posted within `Content' on ICON. Scribed notes.
- Week 4: Survey of other results on epsilon nets, approximations, and discrepancy, along with possible topics for the term paper. Clustering -- k-center clustering, and an introduction to the k-median clustering problem. Chapter 4 of Har-Peled is a good source; a sketch of the greedy k-center algorithm appears after this list. Scribed notes.
- Week 5: Local search for k-median clustering (Chapter 4 of Har-Peled). The k-means heuristic. Term paper topics. Scribed notes.
- Week 6: Streaming algorithms: Morris' counter (a sketch appears after this list). The notes from Jelani Nelson's course, linked above, are a good reference. Scribed notes.
- Week 7: Estimating the number of distinct elements in a data stream. A deterministic algorithm for frequent elements (a sketch appears after this list). Lectures 2 and 1 from Amit Chakrabarti's notes, linked above, are a good reference. Scribed notes.
- Week 8: A randomized algorithm for frequent elements in the turnstile model. A randomized algorithm for estimating the frequency moment F2 (a sketch appears after this list). Lectures 4 and 6 from Amit Chakrabarti. Scribed notes.
- Week 9: The weighted majority and multiplicative weights update algorithms (a sketch of weighted majority appears after this list). See the scribed notes, and the survey by Arora, Hazan, and Kale referenced in them.
- Week 10: Application to learning a linear classifier (Section 3.1 of the above survey) and to solving zero-sum games (Section 3.2). Scribed notes.
- Week 11: Completing the application to zero-sum games, and an application to learning theory and boosting (Section 3.6 of the above survey). Scribed notes.
- Week 12: We covered the aggregation result from Braverman and Mossel's paper `Sorting from Noisy Information'. We did not go through the proofs in detail, but instead got an idea of how the result is obtained. Scribed notes.
- Week 14: To introduce smoothed analysis, we gave an overview of some of the results in the paper `Smoothed Analysis of Left-to-Right Maxima with Applications' by Damerow et al. We established Theorem 3.1 of that paper, which bounds the number of left-to-right maxima under uniform noise. We introduced the Nemhauser-Ullman algorithm for the knapsack problem (a sketch appears after this list).
- Week 15: We showed that the smoothed running time of the Nemhauser-Ullman algorithm is polynomial, a result of Beier and Vocking from `Random Knapsack in Expected Polynomial Time'. For the analysis, we used an isolating lemma from their other paper, `Typical Properties of Winners and Losers in Combinatorial Optimization'.
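Algorithm Sketches
To make a few of the algorithms above concrete, some minimal Python sketches follow. They are illustrations under my own choices of data representation and parameters, not the precise versions from the references.

Weeks 4--5, k-center clustering: a sketch of the greedy farthest-point algorithm (usually attributed to Gonzalez), whose clustering radius is at most twice optimal; Chapter 4 of Har-Peled covers k-center. The Euclidean point representation here is an assumption made for illustration.

    import math

    def k_center_greedy(points, k):
        """Greedy (farthest-point) k-center: pick an arbitrary first center,
        then repeatedly add the point farthest from the current centers.
        The clustering radius of the returned centers is at most twice optimal."""
        centers = [points[0]]                            # arbitrary first center
        d = [math.dist(p, centers[0]) for p in points]   # distance to nearest center
        for _ in range(k - 1):
            i = max(range(len(points)), key=lambda j: d[j])  # farthest point
            centers.append(points[i])
            d = [min(d[j], math.dist(p, points[i])) for j, p in enumerate(points)]
        return centers

    # Example: five points in the plane, two centers.
    print(k_center_greedy([(0, 0), (1, 0), (0, 1), (10, 10), (11, 10)], 2))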
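Week 6, Morris' counter: a sketch of the approximate counter, which stores only an exponent x and satisfies E[2^x - 1] = n after n increments. The averaging harness below is my own addition, illustrating the standard variance-reduction trick.

    import random

    class MorrisCounter:
        """Morris' counter: increment x with probability 2^(-x);
        then E[2^x - 1] equals the true count."""
        def __init__(self):
            self.x = 0
        def increment(self):
            if random.random() < 2.0 ** (-self.x):
                self.x += 1
        def estimate(self):
            return 2 ** self.x - 1

    # Average many independent counters to reduce the variance.
    counters = [MorrisCounter() for _ in range(100)]
    n = 10000
    for _ in range(n):
        for c in counters:
            c.increment()
    print(sum(c.estimate() for c in counters) / len(counters))  # roughly n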
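Week 7, deterministic frequent elements: a sketch of the Misra-Gries algorithm, which I believe is the deterministic algorithm covered in Lecture 1 of Chakrabarti's notes. With k - 1 counters, every element occurring more than m/k times in a stream of length m survives, and each reported count undercounts the true frequency by at most m/k.

    def misra_gries(stream, k):
        """Misra-Gries with k - 1 counters: any element occurring more than
        m/k times (m = stream length) appears in the output; each reported
        count is below the true frequency by at most m/k."""
        counters = {}
        for x in stream:
            if x in counters:
                counters[x] += 1
            elif len(counters) < k - 1:
                counters[x] = 1
            else:
                for y in list(counters):   # decrement all; drop zeros
                    counters[y] -= 1
                    if counters[y] == 0:
                        del counters[y]
        return counters

    print(misra_gries(['a', 'b', 'a', 'c', 'a', 'a', 'b', 'd', 'a'], 3))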
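Week 8, estimating F2: a sketch of the AMS `tug-of-war' estimator, which I believe is the algorithm in Lecture 6 of Chakrabarti's notes. For clarity, the sketch draws a fully random sign per item from an explicit table; the real small-space algorithm draws the signs from a 4-wise independent hash family. The group/repetition parameters are illustrative.

    import random
    from statistics import median

    def ams_f2(stream, groups=5, per_group=10):
        """AMS estimator for F2 = sum_i f_i^2. One basic estimate: draw a
        random sign sigma(i) in {-1, +1} per item and maintain
        z = sum_i sigma(i) * f_i over the stream; then E[z^2] = F2.
        Averaging per_group basic estimates controls the variance; the
        median over groups boosts the confidence (median of means)."""
        def basic_estimate():
            signs, z = {}, 0
            for x in stream:
                if x not in signs:
                    signs[x] = random.choice((-1, 1))
                z += signs[x]
            return z * z
        means = [sum(basic_estimate() for _ in range(per_group)) / per_group
                 for _ in range(groups)]
        return median(means)

    stream = ['a'] * 4 + ['b'] * 2 + ['c']  # true F2 = 16 + 4 + 1 = 21
    print(ams_f2(stream))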
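Week 9, weighted majority: a sketch of the deterministic weighted majority algorithm as described in the Arora-Hazan-Kale survey: predict with the weighted vote of the experts, and multiply the weight of each wrong expert by (1 - eta). The toy experts in the example are my own.

    def weighted_majority(expert_predictions, outcomes, eta=0.5):
        """Deterministic weighted majority. Mistake bound (from the survey):
        at most 2(1 + eta) * m + (2 ln N)/eta, where m is the number of
        mistakes of the best expert and N the number of experts."""
        weights = [1.0] * len(expert_predictions)
        mistakes = 0
        for t, outcome in enumerate(outcomes):
            vote_for_1 = sum(w for w, preds in zip(weights, expert_predictions)
                             if preds[t] == 1)
            prediction = 1 if vote_for_1 >= sum(weights) / 2 else 0
            if prediction != outcome:
                mistakes += 1
            for i, preds in enumerate(expert_predictions):
                if preds[t] != outcome:          # penalize wrong experts
                    weights[i] *= (1 - eta)
        return mistakes, weights

    # Toy run: expert 0 is always right, expert 1 always wrong, expert 2 mixed.
    outcomes = [1, 0, 1, 1, 0, 1]
    experts = [outcomes, [1 - o for o in outcomes], [1, 1, 0, 1, 0, 0]]
    print(weighted_majority(experts, outcomes))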
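Weeks 14--15, the Nemhauser-Ullman algorithm: a sketch of the Pareto-curve dynamic program for knapsack. Process the items one at a time, maintaining the set of Pareto-optimal (weight, profit) pairs achievable by subsets of the items seen so far; the optimum is the most profitable Pareto point that fits. The running time is proportional to the total size of these Pareto sets, which is the quantity the Beier-Vocking smoothed analysis bounds.

    def nemhauser_ullman(items, capacity):
        """items: list of (weight, profit) pairs. Maintain the Pareto set of
        (weight, profit) pairs over subsets of the items seen so far, where a
        pair dominates another if its weight is <= and its profit is >=."""
        pareto = [(0, 0)]                                  # the empty subset
        for w, p in items:
            candidates = pareto + [(pw + w, pp + p) for pw, pp in pareto]
            candidates.sort(key=lambda t: (t[0], -t[1]))   # weight asc, profit desc
            pareto, best = [], -1
            for cw, cp in candidates:
                if cp > best:                              # keep non-dominated pairs
                    pareto.append((cw, cp))
                    best = cp
        return max(pp for pw, pp in pareto if pw <= capacity)

    # Example: the best subset within capacity 5 is the first and third items
    # (total weight 5, profit 9).
    print(nemhauser_ullman([(2, 3), (3, 4), (3, 6)], 5))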