Events
Fall 2017

Optimization Seminar

Thursday, November 16th, 2017, 10:30 am – 12:00 pm


Location: Calvin Lab auditorium

Lazy Conditional Gradients through Simpler Oracles

Conditional gradient descent methods are popular first-order methods for (smooth) constrained convex minimization. Relying on a linear programming oracle, these methods often outperform projected gradient descent whenever projections onto the feasible region are expensive. Unfortunately, even these methods can suffer from prohibitive running times if the linear programming oracle itself is expensive, e.g., when the feasible region corresponds to a hard combinatorial optimization problem.
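
For orientation, here is a minimal sketch of a classical conditional gradient (Frank-Wolfe) iteration over a polytope; the names grad_f and lp_oracle are placeholders for the user's gradient and linear programming oracle, not code from the talk:

    import numpy as np

    def frank_wolfe(grad_f, lp_oracle, x0, iterations=100):
        # Minimize a smooth convex f over a polytope P, where
        # lp_oracle(c) returns argmin over v in P of <c, v>.
        x = np.asarray(x0, dtype=float)
        for t in range(iterations):
            g = grad_f(x)                  # gradient at the current iterate
            v = lp_oracle(g)               # LP oracle call: the expensive step
            gamma = 2.0 / (t + 2.0)        # standard step-size rule
            x = (1.0 - gamma) * x + gamma * v   # convex combination stays feasible
        return x

Every iteration pays for one LP oracle call, which is exactly the cost the lazification described below tries to avoid.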

In this talk, we will explore a general method to significantly speed up conditional gradient descent methods by replacing the linear programming oracle with a much easier and cheaper oracle. This leads to real-world speedups of several orders of magnitude while maintaining identical theoretical convergence rates up to (small!) constant factors. Moreover, we will outline a conditionally accelerated lazy stochastic gradient descent method (CAL-SGD) that achieves optimal bounds on the number of gradient evaluations and calls to the new oracle, matching those for the more complex linear programming oracle.
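
The lazification idea can be sketched as follows: previously discovered vertices are reused as long as one of them guarantees enough progress against a current threshold, and the expensive linear programming oracle is queried only when the cache fails. This is an illustrative sketch of the general scheme under assumed names; the exact progress test and threshold-halving rule here are simplifications based on the abstract, not the speaker's implementation:

    import numpy as np

    def lazy_frank_wolfe(grad_f, lp_oracle, x0, iterations=100, phi0=1.0):
        x = np.asarray(x0, dtype=float)
        cache = [x.copy()]            # vertices found in earlier rounds
        phi = phi0                    # current progress target
        for t in range(iterations):
            g = grad_f(x)
            # Cheap oracle: scan the cache for a vertex with enough progress.
            v = next((c for c in cache if g @ (x - c) >= phi), None)
            if v is None:
                # Cache failed: fall back to the true LP oracle once.
                v = lp_oracle(g)
                if g @ (x - v) >= phi:
                    cache.append(v)   # useful vertex, remember it
                else:
                    phi /= 2.0        # no vertex makes enough progress: halve target
                    continue
            gamma = 2.0 / (t + 2.0)
            x = (1.0 - gamma) * x + gamma * v
        return x

In practice most iterations are answered from the cache, so the number of LP oracle calls drops sharply even though the iteration count is unchanged.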

(Based on joint work with Gábor Braun, George Lan, Yi Zhou, and Daniel Zink.)