In this paper, we introduce a stochastic projected subgradient method for weakly convex (i.e., uniformly prox-regular) nonsmooth, nonconvex functions---a wide class of functions which includes the additive and convex composite classes. At a high level, the method is an inexact proximal point iteration in which the strongly convex proximal subproblems are quickly solved with a specialized stochastic projected subgradient method. The primary contribution of this paper is a simple proof that the proposed algorithm converges at the same rate as the stochastic gradient method for smooth nonconvex problems. This result appears to be the first convergence rate analysis of a stochastic (or even deterministic) subgradient method for the class of weakly convex functions.
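The two-level structure described above---an outer inexact proximal point loop whose strongly convex subproblems are approximately solved by an inner projected stochastic subgradient loop---can be sketched as follows. This is a minimal illustration under assumed choices (the `subgrad`, `project`, and parameter names, the 2/(rho*(t+2)) inner step size, and the weighted-averaging scheme are all assumptions for the sketch), not the paper's exact algorithm:

```python
import numpy as np

def prox_guided_subgradient(subgrad, project, x0, rho=1.0,
                            outer_iters=20, inner_iters=50):
    """Sketch of an inexact proximal point method for a rho-weakly
    convex objective f, with each proximal subproblem solved inexactly
    by projected (stochastic) subgradient descent.

    subgrad(x) -> a (possibly stochastic) subgradient of f at x
    project(x) -> Euclidean projection onto the closed convex feasible set
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(outer_iters):
        center = x.copy()  # proximal center x_k of the outer iteration
        y = x.copy()
        acc = np.zeros_like(y)
        wsum = 0.0
        for t in range(inner_iters):
            # Subgradient of the rho-strongly convex proximal model
            #   phi(y) = f(y) + (rho/2) * ||y - center||^2
            g = subgrad(y) + rho * (y - center)
            step = 2.0 / (rho * (t + 2))  # standard strongly convex schedule
            y = project(y - step * g)
            w = t + 1  # weights favoring later iterates
            acc += w * y
            wsum += w
        x = acc / wsum  # weighted average approximates the proximal point
    return x
```

For example, applied to the (convex, hence 0-weakly convex) function f(x) = |x - 3| on the box [-10, 10], the iterates approach the minimizer 3: the inner loop approximates the proximal map of f, and the outer loop moves the proximal center steadily toward the solution.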