We study the Unadjusted Langevin Algorithm (ULA) for sampling from a probability distribution $\nu = e^{-f}$ on $\mathbb{R}^n$. We prove a convergence guarantee in Kullback-Leibler (KL) divergence assuming $\nu$ satisfies a log-Sobolev inequality and the Hessian of $f$ is bounded. Notably, we do not assume convexity or bounds on higher derivatives. We also prove convergence guarantees in Rényi divergence of order $q > 1$ assuming the limit of ULA satisfies either a log-Sobolev or a Poincaré inequality.
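For concreteness, the ULA iteration is $x_{k+1} = x_k - h\,\nabla f(x_k) + \sqrt{2h}\,\xi_k$ with step size $h$ and $\xi_k \sim \mathcal{N}(0, I_n)$. The following is a minimal NumPy sketch of this update, not the paper's artifact; the names `ula` and `grad_f` and the test target are illustrative assumptions.

```python
import numpy as np

def ula(grad_f, x0, step_size, n_iters, rng=None):
    """Sketch of the ULA update: x <- x - h * grad_f(x) + sqrt(2h) * standard Gaussian noise."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    samples = []
    for _ in range(n_iters):
        noise = rng.standard_normal(x.shape)
        x = x - step_size * grad_f(x) + np.sqrt(2.0 * step_size) * noise
        samples.append(x.copy())
    return np.array(samples)

# Illustrative target: nu = e^{-f} with f(x) = |x|^2 / 2, i.e. a standard Gaussian,
# which satisfies a log-Sobolev inequality with constant 1.
if __name__ == "__main__":
    grad_f = lambda x: x  # gradient of f(x) = |x|^2 / 2
    chain = ula(grad_f, x0=np.zeros(2), step_size=0.05, n_iters=10_000)
    print(chain[-5:])     # late iterates are approximately N(0, I) samples (up to O(h) bias)
```

Note the iterates are "unadjusted": there is no Metropolis accept/reject step, so for fixed $h$ the chain converges to a biased limit near $\nu$ rather than to $\nu$ itself.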