A Proximal Method of Stochastic Gradient for Convex Optimization

Article Type:
Research/Original Article (accredited journal ranking)
Abstract:
The Proximal Stochastic Average Gradient (Prox-SAG+) method solves optimization problems whose objective is the sum of two convex functions. Such problems arise frequently in machine learning, where large datasets give rise to objectives built from many component functions. Because of its favorable properties, a proximal operation is applied to obtain the optimal value. The Prox-SAG+ algorithm is faster and simpler than several earlier methods, and the proximal operator helps ensure that the result obtained is optimal. It is also proven that the proposed method converges at an approximately geometric rate. The proposed operator makes the method more practical than other algorithms in the literature, and numerical experiments confirm the efficiency of the proposed scheme.
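The abstract describes a proximal stochastic average gradient scheme for minimizing a sum of two convex functions, f(x) + g(x), where f is an average of many smooth component functions and g admits a cheap proximal operator. The paper's exact Prox-SAG+ update rules are not reproduced here; the following is a minimal sketch of the standard Prox-SAG template it builds on, assuming g is an L1 penalty (whose prox is soft-thresholding) and a hypothetical `grad_i` callback supplying component gradients:

```python
import numpy as np

def soft_threshold(x, t):
    # prox of t*||.||_1: shrink each coordinate toward zero by t
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_sag_l1(grad_i, n, dim, step, lam, iters=2000, seed=0):
    """Generic Prox-SAG sketch (not the paper's exact Prox-SAG+ variant).

    grad_i(i, x) -> gradient of the i-th smooth component f_i at x.
    Minimizes (1/n) sum_i f_i(x) + lam * ||x||_1.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(dim)
    table = np.zeros((n, dim))   # last stored gradient of each component
    g_avg = np.zeros(dim)        # running average of the stored gradients
    for _ in range(iters):
        i = rng.integers(n)
        g_new = grad_i(i, x)
        g_avg += (g_new - table[i]) / n   # O(dim) update of the average
        table[i] = g_new
        # gradient step on the smooth average, proximal step on the L1 term
        x = soft_threshold(x - step * g_avg, step * lam)
    return x

# Illustrative use on a tiny synthetic lasso instance:
# f_i(x) = 0.5 * (a_i^T x - b_i)^2
rng = np.random.default_rng(1)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
grad = lambda i, x: (A[i] @ x - b[i]) * A[i]
x_hat = prox_sag_l1(grad, n=20, dim=5, step=0.02, lam=0.1, iters=3000)
```

Keeping one stored gradient per component and updating only the sampled entry is what gives the average-gradient family its cheap per-iteration cost while still using full-gradient information, which is the source of the (approximately) geometric convergence rate the abstract refers to.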
Language:
English
Published:
Control and Optimization in Applied Mathematics, Volume:8 Issue: 1, Winter-Spring 2023
Pages:
19 to 32
https://www.magiran.com/p2586806