Application of Zhang neural networks to time-varying nonlinear function optimization

Article Type:
Research/Original Article (accredited journal)
Abstract:
Introduction

Optimization of nonlinear time-varying functions, a subset of nonlinear programming, arises widely in economic and engineering models. In energy management, for example, optimizing nonlinear functions with time-varying components supports the efficient allocation of energy resources and the management of changes in demand and supply, increasing efficiency and reducing energy waste. In this article, we use Zhang neural networks to optimize nonlinear functions with time-varying components. By harnessing the parallel processing power of neural networks, Zhang networks search the solution space faster than traditional methods, significantly reducing the required computation time.
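To make the core idea concrete, a minimal MATLAB sketch follows. It is an illustration under assumptions, not the paper's model: the scalar cost f(x,t) = (x - sin t)^2, the gain gamma, and the forward-Euler discretization are all chosen here for demonstration. A Zhang neural network defines an error e(t) = df/dx and drives it to zero through the design formula de/dt = -gamma*e, so the state x(t) tracks the time-varying minimizer sin(t).

% Minimal ZNN sketch (illustrative assumptions, not the paper's model):
% minimize f(x,t) = (x - sin t)^2 by forcing e(t) = df/dx -> 0
% via the Zhang design formula de/dt = -gamma * e.
gamma = 10;                        % convergence-rate gain (assumed)
dt = 1e-3;  T = 5;
t = 0:dt:T;
x = zeros(size(t));  x(1) = 0.5;   % arbitrary initial state
for k = 1:numel(t)-1
    e = 2*(x(k) - sin(t(k)));      % e(t) = df/dx at time t(k)
    % de/dt = 2*(xdot - cos t) = -gamma*e  =>  solve for xdot:
    xdot = cos(t(k)) - (gamma/2)*e;
    x(k+1) = x(k) + dt*xdot;       % forward-Euler integration step
end
plot(t, x - sin(t));               % tracking residual decays toward 0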

Method

In this research, the proposed neural network receives its data through MATLAB software. The data are first standardized using standard normalization methods and then divided into four stages of training, testing, experimentation, and validation, which are evaluated over five phases. Training is based on the Levenberg-Marquardt algorithm for the first layer and a linear transfer function for the second layer. The best network structure, together with its transfer function, is then selected, and the proposed neural network model is tested in five stages. A Taylor series is used for data normalization, and the zero-stability model of the n-step discrete-time method is used to compute the error, which reduces it. The Levenberg-Marquardt algorithm was chosen for the analysis because of its convergence speed and higher efficiency, owing to its small error level and its tendency to avoid local minima.
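A minimal MATLAB sketch of a training setup of the kind described above follows. The paper does not publish its code, so the hidden-layer size, the split ratios, and the toy dataset are assumptions for illustration; fitnet with 'trainlm' is MATLAB's standard Levenberg-Marquardt fitting network, and 'purelin' gives the linear second (output) layer.

% Toy time-varying data (assumed for illustration)
t_in  = linspace(0, 2*pi, 500);
x_in  = [t_in; sin(t_in)];             % inputs with a time component
y_out = sin(t_in).^2 + 0.1*t_in;       % nonlinear time-varying target

net = fitnet(10, 'trainlm');           % Levenberg-Marquardt training (10 hidden neurons, assumed)
net.layers{2}.transferFcn = 'purelin'; % linear second (output) layer
net.divideFcn = 'dividerand';          % random train/validation/test split
net.divideParam.trainRatio = 0.70;     % split ratios assumed
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;

[net, tr] = train(net, x_in, y_out);   % train the two-layer network
y_hat   = net(x_in);                   % network predictions
mse_val = perform(net, y_out, y_hat);  % mean squared error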

Results

The best network structure, together with its transfer function, was selected and tested in five stages based on the proposed neural network model. The mean squared error in the third and fourth experiments gradually increased compared with the first two stages. This difference in performance error, as well as in the coefficient of determination, varies from iteration to iteration and is caused by the algorithm getting stuck in local minima.

Discussion

Based on the results obtained in the five test stages, the algorithm underlying the proposed neural network improves network performance by increasing the learning rate. However, the algorithm is highly sensitive to local minima; this problem persists even when the learning rate, and therefore the algorithm's step size, is small. To mitigate this sensitivity, the algorithm used in the proposed network was tested with momentum at different learning rates in five stages (see the sketch below), and the best result was selected. At each stage, the training, testing, and validation processes were also evaluated separately.
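One way to add momentum in MATLAB is the gradient-descent-with-momentum training function; the sketch below is an assumption about how such a test could be set up, and the learning rate and momentum constant shown are placeholder values, not figures reported in the paper.

net.trainFcn = 'traingdm';             % gradient descent with momentum
net.trainParam.lr = 0.05;              % learning rate (assumed value)
net.trainParam.mc = 0.9;               % momentum constant (assumed value)
[net, tr] = train(net, x_in, y_out);   % retrain and compare per stage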

Language:
Persian
Published:
Journal of Intelligent Multimedia Processing and Communication Systems, Volume:4 Issue: 2, 2024
Pages:
31 to 42
magiran.com/p2710256  