Compact Neural Architecture Search for Image Classification Using Gravitational Search Algorithm
This paper presents a compact neural architecture search method for image classification based on the Gravitational Search Algorithm (GSA). Deep learning, through multi-layer computational models, enables automatic feature extraction from raw data at multiple levels of abstraction and plays a key role in complex tasks such as image classification. Neural Architecture Search (NAS), which automatically discovers new architectures for Convolutional Neural Networks (CNNs), suffers from high computational complexity and cost. To address these issues, we develop a GSA-based approach that employs a bi-level, variable-length optimization scheme to design both the micro- and macro-architecture of a CNN. By leveraging a compact search space and modified convolutional bottlenecks, the approach outperforms state-of-the-art methods. Experimental results on the CIFAR-10, CIFAR-100, and ImageNet datasets show that the proposed method achieves a classification accuracy of 98.48% with a search cost of 1.05 GPU days, surpassing existing algorithms in accuracy, search efficiency, and architectural complexity.
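For readers unfamiliar with GSA, the sketch below illustrates one iteration of the standard algorithm (Rashedi et al., 2009): agent masses derived from fitness, pairwise gravitational forces, and stochastic velocity and position updates. It is a minimal, generic illustration, not the paper's bi-level, variable-length NAS formulation; the agent encoding, fitness convention (minimization), and the hyperparameters `G0` and `alpha` are assumptions made for the example.

```python
# Minimal sketch of one standard GSA iteration (illustrative only;
# the paper's architecture encoding and fitness function are not shown here).
import numpy as np

def gsa_step(X, V, fitness, t, T, G0=100.0, alpha=20.0, eps=1e-12):
    """One GSA iteration.
    X: (n_agents, dim) positions, V: velocities,
    fitness: (n_agents,) objective values to be minimized,
    t: current iteration, T: total iterations."""
    n, dim = X.shape
    G = G0 * np.exp(-alpha * t / T)               # gravitational constant decays over time
    best, worst = fitness.min(), fitness.max()
    m = (fitness - worst) / (best - worst + eps)  # smaller fitness -> larger mass
    M = m / (m.sum() + eps)                       # normalized masses

    # Total force on each agent from every other agent, randomly weighted
    F = np.zeros_like(X)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            R = np.linalg.norm(X[i] - X[j])
            F[i] += np.random.rand() * G * M[i] * M[j] * (X[j] - X[i]) / (R + eps)

    a = F / (M[:, None] + eps)                    # acceleration = force / mass
    V = np.random.rand(n, dim) * V + a            # stochastic velocity update
    X = X + V                                     # position update
    return X, V
```

In a NAS setting such as the one described here, each agent's position would encode a candidate architecture and the fitness would be its validation error; those mappings are specific to the proposed method and are omitted from this sketch.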