Bidirectional Layer-by-layer Pre-training Method
Author(s):
Abstract:
This paper presents a bidirectional pre-training method for initializing the weights of a hetero-associative deep neural network. Training deep neural networks often fails to converge because it encounters a large number of local minima, whereas initializing the weights properly, instead of using random values at the start of training, can avoid many of these local minima. The bidirectional layer-by-layer pre-training method pre-trains the weights in the forward and backward directions in parallel; the resulting weight values are then used to initialize the deep neural network. Applying bidirectional layer-by-layer pre-training to the weights of a classifier deep neural network improved both the training speed and the recognition rate on the Bosphorus and CK+ databases.
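The paper's exact procedure is not reproduced on this page, so the following is only a minimal sketch of the general idea, assuming greedy autoencoder-style layer-wise pre-training: a forward stack is pre-trained from the inputs and, in parallel, a backward stack from the targets, and the resulting weights initialize the deep network before fine-tuning. All layer sizes, data, and function names here are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def pretrain_autoencoder(data, n_hidden, epochs=200, lr=0.5):
    """Train a one-hidden-layer autoencoder on `data`; return the
    encoder weights, decoder weights, and the hidden codes."""
    n = data.shape[1]
    W_enc = rng.normal(0.0, 0.1, (n, n_hidden))
    W_dec = rng.normal(0.0, 0.1, (n_hidden, n))
    for _ in range(epochs):
        h = sigmoid(data @ W_enc)                 # encode
        r = sigmoid(h @ W_dec)                    # reconstruct
        d_r = (r - data) * r * (1.0 - r)          # output delta
        d_h = (d_r @ W_dec.T) * h * (1.0 - h)     # hidden delta
        W_dec -= lr * h.T @ d_r / len(data)
        W_enc -= lr * data.T @ d_h / len(data)
    return W_enc, W_dec, sigmoid(data @ W_enc)

# Toy hetero-associative task: map 8-dim inputs to 4-dim targets
# through a net with layer sizes 8 -> 6 -> 5 -> 4 (all hypothetical).
X = rng.random((64, 8))
Y = rng.random((64, 4))

# Forward pre-training: encode the inputs, layer by layer.
W1, _, H1 = pretrain_autoencoder(X, 6)

# Backward pre-training (can run in parallel with the forward pass):
# encode the *targets*; the decoder becomes the network's last layer.
_, W3, _ = pretrain_autoencoder(Y, 5)

# Middle layer: map the forward codes toward the backward codes' size.
W2, _, _ = pretrain_autoencoder(H1, 5)

# The deep net is then initialized with [W1, W2, W3] instead of
# random values, and fine-tuned end-to-end with backpropagation.
weights = [W1, W2, W3]
print([w.shape for w in weights])   # [(8, 6), (6, 5), (5, 4)]
```

The point of the sketch is only the initialization scheme: each weight matrix starts from a representation learned greedily from one end of the data rather than from random noise, which is what the abstract credits for the improved convergence.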
Keywords:
Language:
Persian
Published:
Intelligent Systems in Electrical Engineering, Volume:6 Issue: 2, 2015
Pages:
1 to 10
magiran.com/p1473505