It is well known that Newton's method for optimization and nonlinear systems is well behaved when the approximate Hessian or Jacobian has bounded condition number and is Lipschitz continuous. However, when the approximate Hessian or Jacobian is singular on the domain, much less is known about the behavior of Newton's method.

This thesis builds upon existing theory to formalize more precisely when Newton's method for optimization can be expected to converge globally even when the approximate Hessian is nearly singular. It also provides additional context for understanding when Newton's method for nonlinear systems is likely to fail when the Jacobian is singular.

To mitigate the shortcomings of Newton's method with a line search for nonlinear systems, this work introduces a new approach inspired by trust-region algorithms for optimization. The new approach is shown both to fail less often than competing methods and to perform well on a battery of test problems.
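As context for the method discussed above, the following is a minimal illustrative sketch (not the thesis's algorithm) of Newton's method for a nonlinear system $F(x) = 0$ with a backtracking line search on the merit function $\tfrac{1}{2}\|F(x)\|^2$; the function names, tolerances, and backtracking constants are assumptions chosen for illustration. Note the Newton step requires solving a linear system with the Jacobian, which is exactly where a singular or nearly singular Jacobian causes trouble.

```python
import numpy as np

def newton_line_search(F, J, x0, tol=1e-10, max_iter=50):
    """Newton's method for F(x) = 0 with a backtracking line search.

    Illustrative sketch only: F maps R^n -> R^n, J returns its Jacobian.
    The step is damped until the merit function 0.5*||F(x)||^2 shows
    sufficient decrease. If J(x) is singular, np.linalg.solve raises
    LinAlgError -- the failure mode the surrounding text describes.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        # Newton step: solve J(x) d = -F(x); fails if J(x) is singular.
        d = np.linalg.solve(J(x), -Fx)
        merit = 0.5 * Fx @ Fx
        t = 1.0
        # Backtrack until the damped step sufficiently decreases the merit.
        while 0.5 * np.dot(F(x + t * d), F(x + t * d)) > (1 - 1e-4 * t) * merit:
            t *= 0.5
            if t < 1e-12:
                break
        x = x + t * d
    return x
```

For example, applied to the system $F(x, y) = (x^2 + y^2 - 1,\ x - y)$ from a nearby starting point, the iteration converges rapidly to the root $(1/\sqrt{2}, 1/\sqrt{2})$; started where the Jacobian is singular, the linear solve fails, which motivates the trust-region-inspired remedy the abstract mentions.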