Forward stagewise and Frank-Wolfe are popular gradient-based, projection-free optimization algorithms, both of which require convex constraints. We propose a method that extends the applicability of these algorithms to problems of the form min_x f(x) s.t. g(x) ≤ k, where f(x) is an invex objective function and g(x) is a non-convex constraint. (Invexity is a generalization of convexity and ensures that all local optima are also global optima.) We provide a theorem that defines a class of monotone component-wise transformation functions x_i = h(z_i). These transformations yield a convex constraint function G(z) = g(h(z)). Assuming invexity of the original objective f(x), the same transformation x_i = h(z_i) yields a transformed objective function F(z) = f(h(z)) that is also invex. For algorithms that rely on a non-zero gradient ∇F to produce new update steps, invexity ensures that these algorithms continue to make progress as long as a descent direction exists.
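The construction can be illustrated with a standard example of our own choosing (the abstract does not specify one): the non-convex ℓ_p constraint g(x) = Σ_i |x_i|^p with p = 1/2, which the monotone component-wise transform x_i = h(z_i) = sign(z_i)|z_i|^{1/p} turns into the convex ℓ1 constraint G(z) = g(h(z)) = Σ_i |z_i|. Below is a minimal Python sketch that checks this identity and runs plain Frank-Wolfe in z-space on a quadratic (hence invex) objective; the objective, step-size schedule, and all function names are illustrative assumptions, not the paper's method.

```python
import math

P = 0.5  # exponent of the assumed non-convex lp constraint (illustrative choice)

def h(z):
    """Monotone component-wise transform x_i = sign(z_i) * |z_i|**(1/P)."""
    return [math.copysign(abs(zi) ** (1.0 / P), zi) for zi in z]

def g(x):
    """Non-convex constraint g(x) = sum_i |x_i|**P (lp quasi-norm, P < 1)."""
    return sum(abs(xi) ** P for xi in x)

def G(z):
    """Transformed constraint G(z) = g(h(z)) = sum_i |z_i|: the convex l1 norm."""
    return sum(abs(zi) for zi in z)

def frank_wolfe(target, k=1.0, steps=200):
    """Frank-Wolfe in z-space on F(z) = f(h(z)) with f(x) = sum_i (x_i - t_i)^2.

    The l1 ball ||z||_1 <= k is the transformed feasible set, so the linear
    minimization oracle is just a signed vertex of that ball.
    """
    n = len(target)
    z = [k / (2 * n)] * n  # strictly feasible, non-zero start
    for t in range(steps):
        x = h(z)
        # chain rule: dF/dz_i = 2 (x_i - t_i) * h'(z_i), h'(z_i) = (1/P)|z_i|**(1/P - 1)
        grad = [2.0 * (xi - ti) * (1.0 / P) * abs(zi) ** (1.0 / P - 1.0)
                for xi, ti, zi in zip(x, target, z)]
        # LMO over ||s||_1 <= k: put all mass on the largest-gradient coordinate
        j = max(range(n), key=lambda i: abs(grad[i]))
        s = [0.0] * n
        s[j] = -k * math.copysign(1.0, grad[j])
        # 2/(t+3) instead of the standard 2/(t+2): the first step then stays
        # strictly interior, since h'(0) = 0 would zero out gradient coordinates
        gamma = 2.0 / (t + 3.0)
        z = [(1 - gamma) * zi + gamma * si for zi, si in zip(z, s)]
    return z
```

Because each iterate is a convex combination of a feasible point and an ℓ1-ball vertex, the transformed constraint G(z) ≤ k holds at every step, even though the original constraint set {x : g(x) ≤ k} is non-convex.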