Monday, August 21st, 2017
In this tutorial, we will start by discussing some of the basic ideas and techniques that allow a discrete optimization problem to be relaxed or immersed into a continuous optimization problem. Typically, the resulting continuous optimization problem is convex, implying strong duality and efficiency properties. As part of the first lecture, we will review convexification ideas and strong duality in the context of cone optimization problems. The second and third lectures will focus on approaches for going back from the continuous problem to the discrete one, i.e., rounding a solution of the larger, continuous problem to a solution of the discrete problem. This is a very rich area, and we will survey a variety of techniques.
This mini course will meet in three sessions: Monday, August 21 from 9:30 - 10:30 AM and from 11:00 AM - 12:00 PM, and Tuesday, August 22 from 9:30 - 10:30 AM.
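To make the relax-and-round paradigm concrete, here is a minimal sketch for minimum vertex cover: the 0/1 integer program is relaxed to a linear program, and the fractional solution is rounded by thresholding at 1/2, which yields a 2-approximation. The small graph and the use of scipy's linprog are illustrative choices, not part of the course material.

```python
# Relax-and-round sketch for minimum vertex cover (illustrative data).
import numpy as np
from scipy.optimize import linprog

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # a small example graph
n = 4

# LP relaxation: minimize sum_v x_v  s.t.  x_u + x_v >= 1 for every edge,
# 0 <= x_v <= 1.  linprog expects A_ub @ x <= b_ub, so each edge constraint
# is written as -x_u - x_v <= -1.
c = np.ones(n)
A_ub = np.zeros((len(edges), n))
for i, (u, v) in enumerate(edges):
    A_ub[i, u] = -1.0
    A_ub[i, v] = -1.0
b_ub = -np.ones(len(edges))

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * n)

# Rounding: each constraint x_u + x_v >= 1 forces max(x_u, x_v) >= 1/2, so
# taking every vertex with fractional value at least 1/2 gives a feasible
# cover of cost at most twice the LP optimum.
cover = [v for v in range(n) if res.x[v] >= 0.5]
print("LP value:", res.fun, " rounded cover:", cover)
```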
In these introductory lectures we will give a unified introduction to Semidefinite Programming hierarchies and Sum of Squares techniques for continuous/discrete optimization, highlighting their many different and complementary interpretations (Lagrangian/algebraic duality, geometric embeddings, proof systems, probabilistic, etc.). We will concisely summarize the state of the art in both positive and negative results (e.g., planted clique, tensor completion, constraint satisfaction), as well as current algorithmic approaches.
This mini course will meet in four sessions: Monday, August 21 from 2:00 - 3:00 PM and from 3:30 - 4:30 PM, and Wednesday, August 23 from 9:30 - 10:30 AM and from 11:00 AM - 12:00 PM.
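As a concrete instance of the first (degree-2) level of these hierarchies, the sketch below solves the basic SDP relaxation of Max-Cut and rounds it with a random hyperplane in the style of Goemans-Williamson. The example graph and the use of the cvxpy package are assumptions made purely for illustration.

```python
# Basic SDP relaxation of Max-Cut with hyperplane rounding (assumes cvxpy).
import numpy as np
import cvxpy as cp

W = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)  # weighted adjacency matrix
n = W.shape[0]

# Relaxation: replace x_i in {-1, +1} by unit vectors, i.e. optimize over
# PSD matrices X with X_ii = 1 standing in for the moment matrix x x^T.
X = cp.Variable((n, n), PSD=True)
objective = cp.Maximize(cp.sum(cp.multiply(W, 1 - X)) / 4)
problem = cp.Problem(objective, [cp.diag(X) == 1])
problem.solve()

# Rounding: factor X = V V^T and cut with a random hyperplane.
eigvals, eigvecs = np.linalg.eigh(X.value)
V = eigvecs @ np.diag(np.sqrt(np.clip(eigvals, 0, None)))  # rows are vectors
g = np.random.default_rng(0).standard_normal(n)
signs = np.sign(V @ g)
cut_value = sum(W[i, j] for i in range(n) for j in range(i + 1, n)
                if signs[i] != signs[j])
print("SDP value:", problem.value, " rounded cut value:", cut_value)
```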
Tuesday, August 22nd, 2017
In this tutorial, we will start by discussing some of the basic ideas and techniques that allow a discrete optimization problem to be relaxed or immersed into a continuous optimization problem. Typically, the resulting continuous optimization problem is convex, implying strong duality and efficiency properties. As part of the first lecture, we will review convexification ideas and strong duality in the context of cone optimization problems. The second and third lectures will focus on approaches for going back from the continuous problem to the discrete one, i.e., rounding a solution of the larger, continuous problem to a solution of the discrete problem. This is a very rich area, and we will survey a variety of techniques.
This mini course will meet in three sessions: Monday, August 21 from 9:30 - 10:30 AM and from 11:00 AM - 12:00 PM, and Tuesday, August 22 from 9:30 - 10:30 AM.
Techniques and insights from convex optimization play an important role in the design of algorithms for discrete optimization, as evidenced by the success of convex relaxations in providing a framework for constructing and analyzing approximation algorithms. In the last decade, iterative methods from convex optimization have served as a springboard to a number of algorithmic advances for many discrete optimization tasks, including fundamental ones such as submodular optimization and maximum flow problems.
In this boot camp series, we will provide a walk-through of the continuous approach to algorithm design in discrete optimization. We will consider the algorithmic challenges that arise in the choice of formulation and iterative algorithm by casting first-order methods as discretizations of continuous-time dynamics. We will see that the non-smooth nature of discrete optimization problems leads us to smooth our objectives via regularization, and discuss how different choices of regularizers arise in different settings. We will then put these principles to work in the contexts of submodular optimization and competitive analysis of online algorithms.
This mini course will meet in four sessions: Tuesday, August 22 from 11:00 AM - 12:00 PM and from 1:30 - 2:30 PM, and Friday, August 25 from 9:30 - 10:30 AM and from 11:00 AM - 12:00 PM.
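As a toy illustration of two themes in this abstract, the sketch below smooths the non-smooth objective max_i(a_i . x - b_i) via its log-sum-exp regularization and minimizes it by gradient descent, viewed as the Euler discretization of the gradient flow dx/dt = -grad f(x). The data, smoothing parameter, and step size are arbitrary choices for the example.

```python
# Smoothing via regularization + gradient descent as discretized gradient flow.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((10, 3))   # rows a_i of the piecewise-linear objective
b = rng.standard_normal(10)
mu = 0.1                           # smoothing parameter: f_mu -> max as mu -> 0

def grad_smoothed_max(x):
    """Gradient of f_mu(x) = mu * log(sum_i exp((a_i . x - b_i) / mu))."""
    z = (A @ x - b) / mu
    w = np.exp(z - z.max())        # softmax weights, computed stably
    w /= w.sum()
    return A.T @ w

x = np.zeros(3)
step = 0.1                         # Euler step size for the gradient flow
for _ in range(500):
    x = x - step * grad_smoothed_max(x)

print("smoothed minimizer:", x, " objective:", np.max(A @ x - b))
```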
In this session, we will provide an overview of stochastic and robust optimization, with problems in statistics and machine learning as a motivating focus. We will discuss connections between convex optimization and regret minimization, with both stochastic and deterministic feedback. We will view these problems through the minimax lens and show how algorithms emerge as approximate dynamic programming solutions. Finally, we will review robust optimization and explore the interactions between robustness and regularization.
This mini course will meet in three sessions: Tuesday, August 22 from 3:00 - 4:00 PM and from 4:30 - 5:30 PM, and Wednesday, August 23 from 4:30 - 5:30 PM.
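The following small sketch illustrates the regret-minimization side of this connection: online projected gradient descent against a stream of quadratic losses with stochastic feedback, with regret measured against the best fixed point in hindsight. Averaging the iterates (online-to-batch conversion) recovers a stochastic optimization method; the synthetic loss stream and step-size schedule below are illustrative.

```python
# Online projected gradient descent with an O(sqrt(T)) regret step size.
import numpy as np

rng = np.random.default_rng(0)
T = 2000
z = 0.3 + 0.5 * rng.standard_normal(T)   # noisy observations of the target 0.3

x, iterates, losses = 0.0, [], []
for t in range(1, T + 1):
    iterates.append(x)
    losses.append((x - z[t - 1]) ** 2)   # loss f_t(x) = (x - z_t)^2 is revealed
    grad = 2.0 * (x - z[t - 1])          # gradient feedback for the update
    x = np.clip(x - grad / np.sqrt(t), -1.0, 1.0)  # project back onto [-1, 1]

# Regret against the best fixed point in hindsight (the clipped mean of z_t).
best = np.clip(z.mean(), -1.0, 1.0)
regret = sum(losses) - np.sum((best - z) ** 2)
print("average regret:", regret / T, " averaged iterate:", np.mean(iterates))
```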
Wednesday, August 23rd, 2017
In these introductory lectures we will give a unified introduction to Semidefinite Programming hierarchies and Sum of Squares techniques for continuous/discrete optimization, highlighting their many different and complementary interpretations (Lagrangian/algebraic duality, geometric embeddings, proof systems, probabilistic, etc.). We will concisely summarize the state of the art in both positive and negative results (e.g., planted clique, tensor completion, constraint satisfaction), as well as current algorithmic approaches.
This mini course will meet in four sessions: Monday, August 21 from 2:00 - 3:00 PM and from 3:30 - 4:30 PM, and Wednesday, August 23 from 9:30 - 10:30 AM and from 11:00 AM - 12:00 PM.
In this tutorial we will cover recent progress on lower bounds on the size of linear programs and semidefinite programs for combinatorial optimization problems. We will present the definition and motivation of extension complexity and its relation to communication complexity. Then we will discuss the lower bound of Fiorini et al. for the TSP and correlation polytopes. We will also review SDP extension complexity, together with related concepts such as the SDP rank of matrices. Finally, we will discuss lower bounds based on the Fourier methods of CLRS and LRS, for both the LP model and the SDP model.
This mini course will meet in four sessions: Wednesday, August 23 from 1:30 - 2:30 PM and from 3:00 - 4:00 PM, and Thursday, August 24 from 3:00 - 4:00 PM and from 4:30 - 5:30 PM.
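A small companion computation, sketched below, builds the central object of this theory: by Yannakakis's factorization theorem, the extension complexity of a polytope {x : Ax <= b} with vertex set V equals the nonnegative rank of its slack matrix S, where S[i, j] = b_i - a_i . v_j. The code constructs S for the regular hexagon, whose extension complexity is known to be 5; computing nonnegative rank, which is NP-hard in general, is where the lower-bound techniques of the lectures come in.

```python
# Slack matrix of the regular hexagon (the object behind extension complexity).
import numpy as np

k = 6
angles = 2 * np.pi * np.arange(k) / k
vertices = np.stack([np.cos(angles), np.sin(angles)], axis=1)   # vertices v_j

# Facet i separates consecutive vertices v_i and v_{i+1}: its outer normal a_i
# points toward the edge midpoint, with offset b_i = a_i . v_i.
normals = np.stack([np.cos(angles + np.pi / k),
                    np.sin(angles + np.pi / k)], axis=1)        # normals a_i
offsets = np.sum(normals * vertices, axis=1)                    # offsets b_i

slack = offsets[:, None] - normals @ vertices.T                 # S[i, j] >= 0
print(np.round(slack, 3))   # 6x6 nonnegative matrix; its ordinary rank is 3,
                            # while its nonnegative rank gives the extension
                            # complexity of the hexagon
```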
In this session, we will provide an overview of stochastic and robust optimization, with problems in statistics and machine learning as a motivating focus. We will discuss connections between convex optimization and regret minimization, with both stochastic and deterministic feedback. We will view these problems through the minimax lens and show how algorithms emerge as approximate dynamic programming solutions. Finally, we will review robust optimization and explore the interactions between robustness and regularization.
This mini course will meet in three sessions: Tuesday, August 22 from 3:00 - 4:00 PM and from 4:30 - 5:30 PM, and Wednesday, August 23 from 4:30 - 5:30 PM.
Thursday, August 24th, 2017
In this tutorial we will cover recent progress on lower bounds on the size of linear programs and semidefinite programs for combinatorial optimization problems. We will present the definition and motivation of extension complexity and its relation to communication complexity. Then we will discuss the lower bound of Fiorini et al. for the TSP and correlation polytopes. We will also review SDP extension complexity, together with related concepts such as the SDP rank of matrices. Finally, we will discuss lower bounds based on the Fourier methods of CLRS and LRS, for both the LP model and the SDP model.
This mini course will meet in four sessions: Wednesday, August 23 from 1:30 - 2:30 PM and from 3:00 - 4:00 PM, and Thursday, August 24 from 3:00 - 4:00 PM and from 4:30 - 5:30 PM.
Interior point methods are one of the key approaches to solving linear programs as well as other convex programs. They give rise to algorithms that are not only the fastest known from the point of view of asymptotic analysis but also often superior in practice.
This series of four lectures will present the fundamentals of interior point methods and touch on the key ideas and concepts behind some of the recent developments. The topics we plan to cover include: (1) background on linear programming optimality conditions, the central path and its neighborhoods, and Newton's method; (2) a complete analysis of a primal-dual path-following algorithm for linear programming and its relationship to a basic primal barrier method; (3) how to perturb the central path to speed up the convergence of an interior point method; and (4) a variant of the interior point method whose running time scales with the intrinsic dimension of the problem (rather than with its possibly larger standard representation).
No previous experience with interior point methods is required.
This mini course will meet in four sessions: Thursday, August 24 from 9:30 - 10:30 AM, from 11:00 AM - 12:00 PM, and from 1:30 - 2:30 PM, and Friday, August 25 from 1:30 - 2:30 PM.
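For readers who want a feel for the path-following idea before the lectures, here is a compact sketch of the simpler primal barrier method that topic (2) relates to: Newton steps on f_t(x) = t * c . x - sum_i log(b_i - a_i . x), with t increased geometrically so that the minimizers x*(t) trace the central path toward the LP optimum. The tiny LP (a box in the plane) and all parameter choices are illustrative.

```python
# Primal log-barrier method for a tiny LP, following the central path.
import numpy as np

c = np.array([-1.0, -1.0])                    # maximize x1 + x2 ...
A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
b = np.array([1.0, 0.0, 1.0, 0.0])            # ... over the unit box [0, 1]^2

x = np.array([0.5, 0.5])                      # strictly feasible starting point
t = 1.0
for _ in range(15):                           # outer loop: follow the path
    for _ in range(15):                       # inner loop: re-center with Newton
        s = b - A @ x                         # slacks, kept strictly positive
        grad = t * c + A.T @ (1.0 / s)        # gradient of barrier objective f_t
        hess = A.T @ np.diag(1.0 / s**2) @ A  # its Hessian
        dx = np.linalg.solve(hess, -grad)     # Newton direction
        alpha = 1.0                           # backtrack to stay strictly feasible
        while np.any(b - A @ (x + alpha * dx) <= 0.0):
            alpha *= 0.5
        x = x + alpha * dx
    t *= 4.0                                  # duality gap after centering: m / t

print("barrier solution:", x)                 # approaches the optimal vertex (1, 1)
```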
Friday, August 25th, 2017
Techniques and insights from convex optimization play an important role in the design of algorithms for discrete optimization, as evidenced by the success of convex relaxations in providing a framework for constructing and analyzing approximation algorithms. In the last decade, iterative methods from convex optimization have served as a springboard to a number of algorithmic advances for many discrete optimization tasks, including fundamental ones such as submodular optimization and maximum flow problems.
In this boot camp series, we will provide a walk-through of the continuous approach to algorithm design in discrete optimization. We will consider the algorithmic challenges that arise in the choice of formulation and iterative algorithm by casting first-order methods as discretizations of continuous-time dynamics. We will see that the non-smooth nature of discrete optimization problems leads us to smooth our objectives via regularization, and discuss how different choices of regularizers arise in different settings. We will then put these principles to work in the contexts of submodular optimization and competitive analysis of online algorithms.
This mini course will meet in four sessions: Tuesday, August 22 from 11:00 AM - 12:00 PM and from 1:30 - 2:30 PM, and Friday, August 25 from 9:30 - 10:30 AM and from 11:00 AM - 12:00 PM.
Interior point methods are one of the key approaches to solving linear programs as well as other convex programs. They give rise to algorithms that are not only the fastest known from the point of view of asymptotic analysis but also often superior in practice.
This series of four lectures will present the fundamentals of interior point methods and touch on the key ideas and concepts behind some of the recent developments. The topics we plan to cover include: (1) background on linear programming optimality conditions, the central path and its neighborhoods, and Newton's method; (2) a complete analysis of a primal-dual path-following algorithm for linear programming and its relationship to a basic primal barrier method; (3) how to perturb the central path to speed up the convergence of an interior point method; and (4) a variant of the interior point method whose running time scales with the intrinsic dimension of the problem (rather than with its possibly larger standard representation).
No previous experience with interior point methods is required.
This mini course will meet in four sessions: Thursday, August 24 from 9:30 - 10:30 AM, from 11:00 AM - 12:00 PM, and from 1:30 - 2:30 PM, and Friday, August 25 from 1:30 - 2:30 PM.