We present two convergence theorems for Hamilton-Jacobi equations and apply them to the convergence of approximations and perturbations of optimal control problems and of two-player zero-sum differential games. One of our results is, for instance, the following. Let T and T_h be the minimal time functions to reach the origin for the two control systems y′ = f(y, a) and y′ = f_h(y, a), both locally controllable at the origin, and let K be any compact set of points controllable to the origin. If ∥f_h − f∥_∞ ≤ Ch, then T(x) − T_h(x) ≤ C_K h^α for all x ∈ K, where α is the exponent of Hölder continuity of T.
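The perturbation estimate in the abstract can be written as a displayed formula; this is a restatement under the stated hypotheses, with the same constants C, C_K and Hölder exponent α as above:

```latex
\[
\|f_h - f\|_{\infty} \le C\,h
\quad \Longrightarrow \quad
T(x) - T_h(x) \le C_K\, h^{\alpha}
\quad \text{for all } x \in K .
\]
```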