Whitened gradient descent, a new updating method for optimizers in deep neural networks
Optimizers are vital components of deep neural networks that perform weight updates. This paper introduces a new updating method for gradient-descent-based optimizers, called whitened gradient descent (WGD). The method is easy to implement, can be used with any optimizer based on the gradient descent algorithm, and does not significantly increase training time. It smooths the training curve and improves classification metrics. To evaluate the proposed algorithm, we performed 48 different tests on two datasets, CIFAR-100 and Animals-10, using three network structures: DenseNet121, ResNet18, and ResNet50. The experiments show that using WGD in gradient-descent-based optimizers significantly improves classification results. For example, integrating WGD into the RAdam optimizer increased the accuracy of DenseNet from 87.69% to 90.02% on the Animals-10 dataset.
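The abstract does not specify the whitening transform itself, so the sketch below is only a hypothetical illustration of where a gradient pre-processing step such as WGD would plug into a standard PyTorch training loop with a gradient-descent-based optimizer like RAdam. The standardization shown (zero-mean, unit-variance gradients per parameter tensor) and the helper name whiten_gradients are assumptions for illustration, not the paper's actual update rule.

import torch

def whiten_gradients(model, eps=1e-8):
    # Hypothetical pre-processing step: standardize each parameter's
    # gradient to zero mean and unit variance before the optimizer
    # update. The true WGD transform is defined in the paper, not here;
    # this only shows where such a step would be applied.
    for p in model.parameters():
        if p.grad is not None and p.grad.numel() > 1:
            g = p.grad
            m, s = g.mean(), g.std()
            p.grad = (g - m) / (s + eps)

# Typical use inside a training loop with any gradient-based optimizer,
# e.g. optimizer = torch.optim.RAdam(model.parameters(), lr=1e-3)
# (RAdam is available in recent PyTorch releases):
#     loss.backward()
#     whiten_gradients(model)   # pre-process gradients before the update
#     optimizer.step()
#     optimizer.zero_grad()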
Fire and Smoke Segmentation using FireNet Combined with UNet3+
A. Eskandari *, H. Khosravi
International Journal of Engineering, Oct 2025
Estimation of Electrical Efficiency of Photovoltaic Panels with Methods Based on Deep Learning Using Image
S. M. Javadimoghaddam *, H. Gholamalinejad, A. Noroozi, M. H. Abdi, H. Mortezapour
Journal of Electrical Engineering
Real-time vehicle type recognition using a convolution and Haar wavelet pooling based classifier
Seyyed Mohammad Javadimoghaddam *
Journal of Control
A novel approach for vehicle identification based on image registration and deep learning
R. Asgarian Dehkordi, H. Khosravi *, H. Asgarian Dehkordi, M. Sheyda
Scientia Iranica, Mar-Apr 2024