Can Non-Convex Optimization Be Robust?
Rong Ge (Duke University)
Optimizing a non-convex function is hard in general, so existing analyses of non-convex optimization rely on problem-specific structures that may not be robust to adversarial perturbations. In this talk, we will see two scenarios in which standard non-convex optimization techniques are not robust, and show how to modify the algorithms to handle adversarial perturbations. The first scenario considers the matrix completion problem against a semi-random adversary that can reveal additional entries of the matrix. Although this weak adversary is harmless to convex relaxations, we show that it can ruin non-convex approaches, and we give a fast algorithm that fixes this problem. The second scenario considers the more general setting in which one does not have access to the exact function being optimized; we show that it is still possible to find an approximately locally optimal solution. Based on joint work with Yu Cheng, Chi Jin, Lydia Liu, and Michael I. Jordan.
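For concreteness, the sketch below illustrates the standard non-convex approach to matrix completion referred to in the first scenario: gradient descent on a low-rank factorization, using only the revealed entries. This is a minimal illustration, not the algorithm from the talk; the `complete` function, its parameters, and the block-reveal adversary are assumptions made for the example. The point is only that a semi-random adversary changes the set of revealed entries, and hence the non-convex objective, even though extra revealed entries only add constraints for a convex relaxation.

```python
# Minimal sketch (not the talk's algorithm): gradient descent on the
# factorized objective  f(U, V) = 1/2 * sum_{(i,j) in Omega} ((U V^T)_ij - M_ij)^2,
# where Omega is the set of revealed entries. A semi-random adversary may
# enlarge Omega with adversarially chosen entries, making the sampling
# pattern non-uniform and distorting the non-convex landscape.
import numpy as np

def complete(M, mask, rank, lr=1e-3, steps=5000, seed=0):
    """Gradient descent on the squared loss over revealed entries.

    M    : (n, m) matrix (values outside the mask are ignored)
    mask : (n, m) boolean array, True where the entry is revealed
    """
    rng = np.random.default_rng(seed)
    n, m = M.shape
    U = 0.1 * rng.standard_normal((n, rank))
    V = 0.1 * rng.standard_normal((m, rank))
    for _ in range(steps):
        R = (U @ V.T - M) * mask                  # residual on revealed entries only
        U, V = U - lr * (R @ V), V - lr * (R.T @ U)  # gradient steps for U and V
    return U @ V.T

# Classical setting: each entry revealed independently with probability 1/2.
rng = np.random.default_rng(1)
A, B = rng.standard_normal((50, 3)), rng.standard_normal((40, 3))
M = A @ B.T
mask = rng.random(M.shape) < 0.5

# Semi-random adversary: additionally reveals a whole block of entries,
# so the observation pattern is no longer uniform.
semi_random_mask = mask.copy()
semi_random_mask[:10, :10] = True

print(np.linalg.norm(complete(M, mask, 3) - M))              # uniform reveals
print(np.linalg.norm(complete(M, semi_random_mask, 3) - M))  # extra adversarial reveals
```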