Academic Editor: Youssef EL FOUTAYENI
Received: January 31, 2021 | Accepted: February 15, 2021 | Published: March 15, 2021
Abstract: The Frank-Wolfe method (also known as the conditional gradient method) for smooth optimization has attracted great interest in recent years in the context of large-scale optimization and machine learning. A key advantage of the method is that it avoids projections. However, the Frank-Wolfe algorithm requires a long execution time when a very small accuracy tolerance is prescribed. In this work, we present a random perturbation of the Frank-Wolfe method for solving nonconvex differentiable programs under linear constraints. The perturbation prevents the iterates from being trapped at local minima. Theoretical results guarantee the ...
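For context (a sketch for orientation, not taken verbatim from the paper), the classical Frank-Wolfe step for minimizing a smooth $f$ over a polytope $C=\{x : Ax \le b\}$ reads

$$ s_k \in \operatorname*{arg\,min}_{s \in C} \, \langle \nabla f(x_k), s \rangle, \qquad x_{k+1} = x_k + \gamma_k \,(s_k - x_k), \qquad \gamma_k \in [0,1], $$

so each iterate is a convex combination of feasible points and no projection onto $C$ is ever required; the subproblem is a linear program over $C$. A perturbed variant in the spirit described above would build the direction from $\nabla f(x_k) + \omega_k$, where $\omega_k$ is a random term vanishing at a suitable rate; this notation is ours and only illustrates how a random perturbation can keep the iterates from settling at a non-global local minimum.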