A Proximal Method of Stochastic Gradient for Convex Optimization

Article Type:
Research/Original Article (published in an accredited journal)
Abstract:
The Proximal Stochastic Average Gradient (Prox-SAG+) method is a primary approach to optimization problems whose objective is the sum of two convex functions. Such problems arise frequently in machine learning, where a large dataset gives rise to an objective composed of many component functions. A proximal operator is applied to obtain the optimal value, owing to its favorable properties. The Prox-SAG+ algorithm is faster than several competing methods and simpler than its predecessors; moreover, the proximal operator helps ensure that the computed result is optimal. It is also proven that the proposed method attains an approximately geometric rate of convergence. Use of the proposed operator makes the method more practical than other algorithms in the literature, and numerical experiments confirm the efficiency of the scheme.
Language:
English
Published:
Control and Optimization in Applied Mathematics, Volume 8, Issue 1, Winter-Spring 2023
Pages:
19 to 32
magiran.com/p2586806  
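The abstract above describes a proximal stochastic-gradient scheme for minimizing a sum of two convex functions. A minimal sketch of a generic proximal stochastic gradient step is given below; this is an illustration only, not the paper's Prox-SAG+ algorithm, and the least-squares loss, L1 regularizer, and soft-thresholding proximal operator are all assumptions chosen for concreteness:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_sgd(A, b, lam=0.1, step=0.01, epochs=50, seed=0):
    """Minimize (1/2n) * ||Ax - b||^2 + lam * ||x||_1 by taking a
    stochastic gradient step on one component function at a time,
    followed by a proximal step on the nonsmooth term."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Gradient of the i-th smooth component (1/2)(a_i^T x - b_i)^2.
            grad_i = A[i] * (A[i] @ x - b[i])
            # Proximal step handles the L1 term in closed form.
            x = soft_threshold(x - step * grad_i, step * lam)
    return x
```

The key design point the abstract alludes to is that the nonsmooth regularizer is never differentiated: only the smooth component functions contribute stochastic gradients, while the proximal operator absorbs the nonsmooth part exactly at each iteration.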