An SDE Perspective on Stochastic Convex Optimization
Abstract
In this paper, we analyze the global and local behavior of gradient-like flows under stochastic errors with the aim of solving convex optimization problems with noisy gradient input. We first study the unconstrained differentiable convex case, using a stochastic differential equation whose drift term is minus the gradient of the objective function and whose diffusion term is either bounded or square-integrable. In this setting, under Lipschitz continuity of the gradient, our first main result shows almost sure convergence of the objective values and of the trajectory process towards a minimizer of the objective function. We also provide a comprehensive complexity analysis by establishing several new pointwise and ergodic convergence rates in expectation for the convex, strongly convex, and (local) Lojasiewicz cases. The latter involves a challenging local analysis which requires non-trivial arguments from measure theory. We then extend our study to the constrained case and, more generally, to nonsmooth problems. We show that several of our results have natural extensions obtained by replacing the gradient of the objective function with a cocoercive monotone operator. This makes it possible to obtain similar convergence results for optimization problems with an additively "smooth + nonsmooth" convex structure. Finally, we consider another extension of our results to nonsmooth optimization which is based on the Moreau envelope.
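Concretely, the dynamics described above can be sketched as the following stochastic differential equation; the notation is illustrative and not necessarily the paper's own, with $f$ the convex objective with Lipschitz continuous gradient, $W$ a standard Brownian motion, and $\sigma$ the diffusion coefficient:
\begin{equation*}
  dX(t) = -\nabla f\bigl(X(t)\bigr)\,dt + \sigma(t)\,dW(t), \qquad X(0) = X_0,
\end{equation*}
where either $\sup_{t \ge 0} \|\sigma(t)\| < +\infty$ (bounded diffusion) or $\int_0^{+\infty} \|\sigma(t)\|^2\,dt < +\infty$ (square-integrable diffusion). In the nonsmooth extensions mentioned above, $-\nabla f$ is replaced by minus a cocoercive monotone operator or by minus the gradient of the Moreau envelope of the objective.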
Keywords
Convex optimization, Stochastic Differential Equation, Stochastic gradient descent, Lojasiewicz inequality, KL inequality, Convergence rate, Asymptotic behavior

AMS subject classifications. 37N40, 46N10, 49M99, 65B99, 65K05, 65K10, 90B50, 90C25, 60H10, 49J52, 90C53