New Directions in Approximation Algorithms

Instructor: Barna Saha

Office Hour: After class and by appointment.

The area of approximation algorithms traditionally aimed to design polynomial-time algorithms for NP-hard problems by allowing approximate solutions to be returned. With the advent of the Big Data era, the need for approximation algorithms goes far beyond tackling NP-hard problems: we may need approximation algorithms even for polynomial-time-solvable problems, to achieve faster running times, improved space complexity, or robustness to changing data. This course will cover some of the newly emerging areas of approximation algorithms. The course topics are subject to change with each offering.

The following is a list of tentative topics for this semester.

  • Traditional Theory of Approximation Algorithms
  • Fine-Grained Approximation Algorithms
  • Dynamic/Online Algorithms
  • Conditional Hardness & Hardness of Approximation

Time & Venue: The course will meet every Monday and Wednesday from 9:00 am to 10:00 am in Dwinelle 109.

Grading: Each student will be required to scribe one or two lectures, present one paper, and do a research project individually or in a group of two. One or two problems will be given every other week to help students better understand the material. These problem sets will be peer-graded and are due at the end of the semester. Students may discuss the problems among themselves but must write up their own solutions.

Writing lecture notes and class participation account for 10% of the grade, the problem sets account for 15%, the paper presentation accounts for 25%, and the project accounts for 50%. The project has several components: (i) a project proposal (less than 1 page) due on Oct 13, (ii) an in-class presentation during the last two weeks of class, and (iii) a final project report (less than 10 pages) due on Dec 2.

Books: There is no required textbook. The following books will be used for reference in the first part of the course: The Design of Approximation Algorithms by Williamson and Shmoys, and Approximation Algorithms by Vazirani.

Prerequisite: Students must have taken a graduate-level course on algorithms or combinatorial optimization, such as IND ENG 269, IND ENG 266, IND ENG 262A, CS 270, or CS 271. If you do not satisfy the prerequisite but want to take the course, please talk to the instructor.

Goal: The goal of the course is to interest students in the course topics enough that they pursue further exploration on their own.

A tentative schedule is below.

Aug 28: Overview of the course. Hard vs. easy problems. Introduction to approximation algorithms. MAX-SAT via linear programming rounding; a 3/4-approximation for MAX-SAT.
Chapters 1 and 5 of Williamson-Shmoys; course notes from Goemans's class.

Lecture note available on piazza.

Further References:
1. On the Approximation of Maximum Satisfiability, M. Yannakakis, SODA 1992.
2. Randomized Variants of Johnson’s Algorithm for MAX-SAT, M. Poloczek, G. Schnitger, SODA 2011.
3. A Simple 3/4-Approximation Algorithm for MAX-SAT, WAOA 2011.
4. A Tight Linear Time (1/2)-Approximation for Unconstrained Submodular Maximization, N. Buchbinder, M. Feldman, J. Naor, R. Schwartz, FOCS 2012.  

Announcement: Sign up for piazza (link sent via email) for class-related discussions.
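As a warm-up for the Aug 28 material, a minimal sketch (the toy instance below is illustrative, not from the lecture notes): a uniformly random assignment satisfies a clause with k distinct literals with probability 1 - 2^(-k), so in expectation it satisfies sum_j (1 - 2^(-k_j)) clauses, at least half of them. Clauses are lists of signed variable indices, e.g. [1, -2] means (x1 OR NOT x2).

```python
from itertools import product

clauses = [[1, 2], [-1, 3], [-2, -3], [1, -3]]  # a toy 2-CNF instance
n = 3

def satisfied(assign):
    """Count clauses satisfied; assign is a tuple of booleans for x1..xn."""
    return sum(any(assign[abs(l) - 1] == (l > 0) for l in c) for c in clauses)

# Averaging over all 2^n assignments equals the expectation under uniform sampling.
avg = sum(satisfied(a) for a in product([False, True], repeat=n)) / 2 ** n
formula = sum(1 - 2 ** -len(c) for c in clauses)
print(avg, formula)  # the two quantities coincide: 3.0 3.0
```

Since every clause here has two literals, the guarantee is already 3/4 of the clauses; the LP rounding covered in lecture lifts this to 3/4 for arbitrary clause lengths.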
Sept 4: Derandomization via the method of conditional expectations. Covering problems: Set Cover, Vertex Cover.
Chapters 1 and 5 of Williamson-Shmoys; Chapter 13 of Vazirani.

Lecture note available on piazza.
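A hedged sketch of the method of conditional expectations for MAX-SAT (the instance is a toy example, not from the notes): fix variables one at a time, always choosing the value that keeps the conditional expectation of satisfied clauses at least as large; the final assignment satisfies at least sum_j (1 - 2^(-k_j)) clauses. Clauses are lists of signed variable indices, e.g. [1, -2] means (x1 OR NOT x2).

```python
def cond_expectation(clauses, partial):
    """Expected #satisfied clauses when unset variables are uniform random.
    partial maps variable index -> bool for the variables fixed so far."""
    total = 0.0
    for c in clauses:
        unset, sat = 0, False
        for l in c:
            v = abs(l)
            if v in partial:
                sat = sat or (partial[v] == (l > 0))
            else:
                unset += 1
        total += 1.0 if sat else 1 - 2 ** -unset  # prob over the unset literals
    return total

def derandomized_maxsat(clauses, n):
    partial = {}
    for v in range(1, n + 1):
        take_true = cond_expectation(clauses, {**partial, v: True})
        take_false = cond_expectation(clauses, {**partial, v: False})
        partial[v] = take_true >= take_false  # never decrease the expectation
    return partial

clauses = [[1, 2], [-1, 3], [-2, -3], [1, -3]]
assignment = derandomized_maxsat(clauses, n=3)
num_sat = int(cond_expectation(clauses, assignment))
print(num_sat)  # on this instance the greedy choice satisfies all 4 clauses
```

Each step compares two conditional expectations, both computable exactly in linear time, so the whole derandomization runs in O(n * total clause length).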


Sept 9: Set Cover continued. Introduction to LP duality, primal-dual, greedy, dual fitting. Chapter 1 of Williamson-Shmoys; Chapters 12 and 13 of Vazirani.
Announcement: Reading list announced. Check piazza.

Lecture note available on piazza.
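A sketch of the standard greedy Set Cover algorithm discussed above (the instance is a made-up example): repeatedly pick the set covering the most still-uncovered elements, which yields an H_n-approximation.

```python
def greedy_set_cover(universe, sets):
    """Return indices of chosen sets; sets is a list of Python sets."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        # pick the set with maximum marginal coverage (ties -> lowest index)
        best = max(range(len(sets)), key=lambda i: len(uncovered & sets[i]))
        if not uncovered & sets[best]:
            raise ValueError("instance is infeasible")
        chosen.append(best)
        uncovered -= sets[best]
    return chosen

universe = range(1, 7)
sets = [{1, 2, 3}, {2, 4}, {3, 4, 5}, {5, 6}, {1, 6}]
picked = greedy_set_cover(universe, sets)
print(picked)  # greedy picks sets 0, 2, 3 on this instance
```

The dual-fitting analysis from lecture charges each covered element the average cost of the step that covered it; the code above only implements the algorithm, not the analysis.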
Sept 11: Submodular optimization: submodular maximization via greedy and local search.


Lecture note available on piazza.

An extensive survey of developments in submodular optimization, modulo very recent advances:
https://www.openu.ac.il/personal_sites/moran-feldman/publications/Handbook2018.pdf



Notes: See J. Vondrak’s tutorial.
https://theory.stanford.edu/~jvondrak/data/submod-tutorial-1.pdf

Notes on submodular function maximization.
http://swoh.web.engr.illinois.edu/courses/ie512/handout/submodular.pdf


References:
An Analysis of Approximations for Maximizing Submodular Set Functions, Nemhauser, Wolsey, Fisher, Mathematical Programming, 1978.
Maximizing Non-monotone Submodular Functions, Feige, Mirrokni, Vondrak, FOCS 2007.
Submodular Function Minimization under Covering Constraints, S. Iwata, K. Nagano, FOCS 2009.
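A hedged sketch of the Nemhauser-Wolsey-Fisher greedy cited above, for monotone submodular maximization under a cardinality constraint: pick k elements, each time the one with the largest marginal gain, giving a (1 - 1/e)-approximation. The coverage function below is a canonical monotone submodular example of my own choosing.

```python
def greedy_submodular(ground, f, k):
    """Greedy (1 - 1/e)-approximation for monotone submodular f, |S| <= k."""
    chosen = []
    for _ in range(k):
        gains = {e: f(chosen + [e]) - f(chosen) for e in ground if e not in chosen}
        best = max(gains, key=gains.get)
        if gains[best] <= 0:
            break  # monotone f: no positive marginal gain remains
        chosen.append(best)
    return chosen

# f(S) = number of elements covered by the sets indexed by S (a coverage function)
coverage_sets = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6}, "d": {1, 6}}
f = lambda S: len(set().union(*(coverage_sets[e] for e in S), set()))
sol = greedy_submodular(list(coverage_sets), f, k=2)
print(sol, f(sol))  # greedy covers all 6 elements with 2 sets here
```

For non-monotone objectives this greedy guarantee fails, which is where the local search of Feige-Mirrokni-Vondrak (referenced above) comes in.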
Sept 16-18: Continuous extensions of submodular functions. Submodular covering problems.

Lecture note available on piazza.

Continuous extensions of submodular functions.
https://theory.stanford.edu/~jvondrak/MATH233B-2017/lec15.pdf

Lovász extension and its connection to Edmonds' greedy algorithm
(also many nice examples of submodular functions in practice)
http://people.csail.mit.edu/stefje/mlss/kyoto_mlss_lecture1.pdf
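A sketch of evaluating the Lovász extension at a point x in [0,1]^n via Edmonds' greedy order, as in the notes linked above: sort coordinates in decreasing order and charge each element its marginal value with respect to the growing prefix. The coverage function f below is an illustrative submodular example of my own (any submodular f with f(empty) = 0 works).

```python
def lovasz_extension(f, x):
    """Evaluate the Lovász extension of f at x; f takes a frozenset of indices."""
    order = sorted(range(len(x)), key=lambda i: -x[i])  # decreasing coordinates
    value, prefix = 0.0, set()
    prev_f = f(frozenset())
    for i in order:
        prefix.add(i)
        cur_f = f(frozenset(prefix))
        value += x[i] * (cur_f - prev_f)   # coordinate weight * marginal gain
        prev_f = cur_f
    return value

cover = [{1, 2}, {2, 3}, {3}]
f = lambda S: len(set().union(*(cover[i] for i in S), set()))
# On a 0/1 indicator vector the extension agrees with f on the indicated set:
print(lovasz_extension(f, [1.0, 0.0, 1.0]), f(frozenset({0, 2})))
```

Equivalently, the extension is the expectation of f over random threshold sets, which is why it is convex exactly when f is submodular.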
Sept 23: Deterministic and randomized rounding of linear programs: the uncapacitated facility location problem.
Sections 4.5 and 5.8 of Williamson-Shmoys.
Sept 25: The metric uncapacitated facility location problem continued. More randomized rounding & primal-dual. Sections 7.6 and 12.1 of Williamson-Shmoys.
Sept 30-Oct 2: More primal-dual: the generalized Steiner tree problem. Chapter 7 of Williamson-Shmoys.
Oct 7 & Oct 9: Cuts & metrics. Chapter 8 of Williamson-Shmoys.
Oct 16 & 21: Cuts & metrics continued. Chapter 8 of Williamson-Shmoys; notes on the ellipsoid algorithm.
Oct 23-28: Semidefinite programming rounding: MAX CUT, MAX-2SAT. Chapter 6 of Williamson-Shmoys.
Oct 30: Hardness of approximation. Chapter 10 of Williamson-Shmoys.
Nov 4: A fine-grained approach to hardness: ETH, SETH, OV. SETH to OV; OV to 2 vs. 3 Diameter. Reference: https://people.csail.mit.edu/virgi/6.s078/
A fine-grained approach to hardness: more on SETH and ETH. Reference: https://people.csail.mit.edu/virgi/6.s078/
OV to sequence alignment problems: Longest Common Subsequence, Edit Distance. References:
https://arxiv.org/abs/1501.07053
Amnesic dynamic programming: Longest Increasing Subsequence & RNA Folding.
References:
http://ieee-focs.org/FOCS-2017-Papers/3464a295.pdf
https://arxiv.org/abs/1308.0626
-------- Buffer --------
Nov 13: Computing edit distance fast. Reference: https://arxiv.org/abs/1810.03664
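For context (this is the textbook baseline, not the algorithm of the referenced paper): the classic O(nm) dynamic program for edit distance, which the fast and approximate algorithms in this part of the course aim to beat.

```python
def edit_distance(a, b):
    """Levenshtein distance via the standard DP, using one O(m) row."""
    n, m = len(a), len(b)
    dp = list(range(m + 1))                   # dp[j] = distance(a[:0], b[:j])
    for i in range(1, n + 1):
        prev_diag, dp[0] = dp[0], i           # prev_diag holds row i-1, col j-1
        for j in range(1, m + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            prev_diag, dp[j] = dp[j], min(dp[j] + 1,         # delete from a
                                          dp[j - 1] + 1,     # insert into a
                                          prev_diag + cost)  # substitute/match
    return dp[m]

print(edit_distance("kitten", "sitting"))  # classic example: distance 3
```

The quadratic barrier for this DP is exactly what the SETH/OV reductions from the earlier lectures show to be conditionally unavoidable for exact computation.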
Nov 18: APSP, (min,+) matrix multiplication, and equivalences. Reference: https://people.csail.mit.edu/rrw/tria-mmult.pdf
Fast algorithm for APSP in a non-uniform model. Reference: https://people.csail.mit.edu/virgi/6.s078/lec12-LDTs.txt
Fast approximation algorithms for APSP. Reference: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.25.4076&rep=rep1&type=pdf
Fast exact algorithm for (min,+) matrix multiplication on Lipschitz matrices. Reference: https://arxiv.org/abs/1707.05095
 
   
Nov 20: Dynamic problems & hardness. Reference: https://people.csail.mit.edu/virgi/6.s078/lec14-dynamic.pdf
Nov 25: Dynamic algorithms: designing dynamic algorithms via primal-dual. Reference: https://arxiv.org/abs/1604.05337
 
Dec 2: Project presentations.
Dec 4: Project presentations.