Learning & Games Visitor Speaker Series: Universal Acceleration for Minimax Optimization
Niao He (ETH)
Calvin Lab Room 116 or Zoom
Title: Universal Acceleration for Minimax Optimization
Abstract: We present a generic acceleration recipe for smooth minimax optimization. By simply combining with existing solvers, such as the extra-gradient method, as the workhorse for subproblems, one can achieve the best-known convergence rates for minimax optimization in various regimes, such as the strongly-convex-(strongly)-concave setting.
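For readers unfamiliar with the extra-gradient method mentioned in the abstract, the following is a minimal illustrative sketch (not the speaker's acceleration recipe): each iteration takes a gradient half-step to a lookahead point, then updates using the gradients evaluated there. The bilinear test problem below is a standard example where plain gradient descent-ascent diverges but extra-gradient converges.

```python
import numpy as np

def extragradient(grad_x, grad_y, x0, y0, eta=0.1, iters=2000):
    """Extra-gradient method for min_x max_y f(x, y).

    grad_x, grad_y: callables returning the partial gradients of f.
    Each iteration computes a lookahead (half) step, then takes the
    actual step using gradients evaluated at the lookahead point.
    """
    x, y = np.asarray(x0, dtype=float), np.asarray(y0, dtype=float)
    for _ in range(iters):
        # Half step: lookahead point
        x_half = x - eta * grad_x(x, y)
        y_half = y + eta * grad_y(x, y)
        # Full step: use gradients at the lookahead point
        x = x - eta * grad_x(x_half, y_half)
        y = y + eta * grad_y(x_half, y_half)
    return x, y

# Bilinear saddle problem f(x, y) = x * y with saddle point (0, 0);
# simultaneous gradient descent-ascent spirals outward here, while
# the extra-gradient iterates contract toward the saddle point.
x_star, y_star = extragradient(lambda x, y: y, lambda x, y: x, 1.0, 1.0)
```

Accelerated schemes of the kind described in the abstract would call a solver like this as an inner subroutine on regularized subproblems.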
Bio: Niao He is currently an Assistant Professor in the Department of Computer Science at ETH Zurich, where she leads the Optimization and Decision Intelligence (ODI) Group. She is an ELLIS Scholar and a core faculty member of the ETH AI Center. Previously, she was an assistant professor at the University of Illinois at Urbana-Champaign from 2016 to 2020. Before that, she received her Ph.D. in Operations Research from the Georgia Institute of Technology in 2015. Her research interests lie at the intersection of optimization and machine learning, with a primary focus on minimax optimization and reinforcement learning.