Mohammad Ansari Shiri
-
Data dimensionality has grown exponentially with the expansion of the Internet and communication networks, making high-dimensional data an increasing challenge for machine learning and data science. This paper presents a hybrid filter-wrapper feature selection method based on Equilibrium Optimization (EO) and Simulated Annealing (SA), named the Filter-Wrapper Binary Equilibrium Optimizer Simulated Annealing (FWBEOSA) algorithm. SA is used to escape local optima so that EO can search more accurately and select a better subset of features. FWBEOSA also employs a filtering phase that increases accuracy while reducing the number of selected features. The proposed method is evaluated on 17 standard UCI datasets using Support Vector Machine (SVM) and K-Nearest Neighbors (KNN) classifiers and compared with ten state-of-the-art algorithms (i.e., Binary Equilibrium Optimizer (BEO), Binary Gray Wolf Optimization (BGWO), Binary Salp Swarm Algorithm (BSSA), Binary Genetic Algorithm (BGA), Binary Particle Swarm Optimization (BPSO), Binary Social Mimic Optimization (BSMO), Binary Atom Search Optimization (BASO), Modified Flower Pollination Algorithm (MFPA), Bare Bones Particle Swarm Optimization (BBPSO), and Two-phase Mutation Gray Wolf Optimization (TMGWO)). With the SVM classifier, the proposed method achieved the highest accuracy on 13 of the 17 datasets (76%) and the smallest number of selected features on 15 of the 17 datasets (88%). With the KNN classifier, it achieved the highest accuracy on 14 datasets (82%) and the smallest number of selected features on 13 datasets (76%).
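As a rough illustration of how such a hybrid wrapper can be organized, the sketch below combines a simplified binary EO population step with an SA refinement of the best mask, scoring each candidate subset by KNN cross-validation accuracy minus a small size penalty. The update rule, pool size, penalty weight, iteration counts, and the load_wine dataset are illustrative assumptions, not the paper's exact FWBEOSA formulation.

```python
# Minimal sketch of a binary EO + SA wrapper loop (simplified, not the paper's exact method).
import numpy as np
from sklearn.datasets import load_wine
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X, y = load_wine(return_X_y=True)          # placeholder dataset
n_features = X.shape[1]

def fitness(mask):
    # accuracy of a 5-NN classifier on the selected columns, minus a small size penalty
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask.astype(bool)], y, cv=5).mean()
    return acc - 0.01 * mask.sum() / n_features

# simplified binary Equilibrium Optimizer: pull each mask toward a random member
# of an "equilibrium pool" of the best masks, with a little random bit-flipping
pop = rng.integers(0, 2, size=(20, n_features))
for _ in range(30):
    scores = np.array([fitness(p) for p in pop])
    pool = pop[np.argsort(scores)[-4:]]
    for i in range(len(pop)):
        eq = pool[rng.integers(len(pool))]
        mix = np.where(rng.random(n_features) < 0.5, eq, pop[i])
        pop[i] = mix ^ (rng.random(n_features) < 0.1)

# simulated annealing refinement: flip single bits, accept worse moves with
# probability exp(delta / T) to escape local optima, then cool the temperature
best = max(pop, key=fitness).copy()
best_fit, T = fitness(best), 1.0
for _ in range(50):
    cand = best.copy()
    cand[rng.integers(n_features)] ^= 1
    cand_fit = fitness(cand)
    if cand_fit > best_fit or rng.random() < np.exp((cand_fit - best_fit) / T):
        best, best_fit = cand, cand_fit
    T *= 0.95

print("selected features:", np.flatnonzero(best), "fitness:", round(best_fit, 3))
```

The size penalty in the fitness function is one simple way to express the paper's dual goal of high accuracy with few features; any subset-size weighting here is an assumption.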
Keywords: Feature selection, Equilibrium Optimizer, Simulated Annealing, Filter, Wrapper
-
Feature selection has become one of the most active topics in machine learning in recent years, and evolutionary algorithms have shown promising results alongside standard feature selection methods. This paper presents a hybrid filter-wrapper algorithm based on Equilibrium Optimization (EO) for K-Nearest Neighbor (KNN) classification. The filter model scores the selected feature subset with a composite measure of feature relevance and redundancy, and the wrapper model is a binary Equilibrium Optimizer (BEO). The hybrid algorithm is called filter-based BEO (FBBEO). By combining the filter and the wrapper, FBBEO achieves a unique combination of efficiency and accuracy. In the experiments, 11 standard datasets from the UCI repository were used. The results indicate that the proposed method is effective in improving classification accuracy and in selecting optimal feature subsets with the fewest features.
Keywords: Feature Selection, Classification, Wrapper, Filter, Equilibrium Optimization
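The abstract does not give the exact composite relevance-redundancy measure, so the sketch below uses mutual information for relevance and absolute Pearson correlation for redundancy purely as stand-ins, in a greedy relevance-minus-redundancy ranking of the kind a filter phase might use before a binary EO wrapper refines the subset. The dataset and the subset size of 10 are also illustrative assumptions.

```python
# Hypothetical filter-phase sketch: rank features by relevance minus redundancy.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import mutual_info_classif

X, y = load_breast_cancer(return_X_y=True)
relevance = mutual_info_classif(X, y, random_state=0)   # feature-class relevance (stand-in measure)
corr = np.abs(np.corrcoef(X, rowvar=False))             # feature-feature redundancy (stand-in measure)

selected = [int(np.argmax(relevance))]                   # start from the most relevant feature
while len(selected) < 10:                                # greedy relevance-minus-redundancy selection
    remaining = [f for f in range(X.shape[1]) if f not in selected]
    scores = [relevance[f] - corr[f, selected].mean() for f in remaining]
    selected.append(remaining[int(np.argmax(scores))])

print("filter-phase candidate subset:", selected)
```

A wrapper such as BEO would then search over (or around) this reduced candidate set using classifier accuracy directly, which is where the efficiency-accuracy trade-off of the hybrid comes from.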
-
Recent advances in science, engineering, and technology have created massive datasets. Machine learning and data mining techniques cannot perform well on these huge datasets because they contain redundant, noisy, and irrelevant features. The purpose of feature selection is to reduce the dimensionality of datasets by selecting the most relevant attributes while simultaneously increasing classification accuracy. Meta-heuristic optimization techniques have become increasingly popular for feature selection in recent years because they overcome the limitations of traditional optimization methods. This paper presents a binary version of the Manta Ray Foraging Optimizer (MRFO), an alternative optimization algorithm. To further reduce cost and computation time, we incorporated Spearman's correlation coefficient into the proposed method, which we called Correlation-Based Binary Manta Ray Foraging (CBBMRF). It eliminates highly positively correlated features at the beginning of the computation, avoiding unnecessary calculations and leading to faster subset selection. The presented algorithms are compared with five state-of-the-art meta-heuristics on 10 standard UCI datasets and demonstrate superior performance on feature selection problems.
Keywords: Feature selection, Optimization, Correlation, Accuracy
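A minimal sketch of the Spearman pre-filtering idea is shown below: a feature is dropped if it is highly positively correlated with a feature that has already been kept, so the wrapper search starts from a smaller space. The 0.95 threshold, the keep-the-first-feature rule, and the breast-cancer dataset are illustrative assumptions rather than values from the paper, and the binary MRFO wrapper itself is not shown.

```python
# Sketch of Spearman-based pre-filtering before a wrapper search (illustrative assumptions).
import numpy as np
from scipy.stats import spearmanr
from sklearn.datasets import load_breast_cancer

X, y = load_breast_cancer(return_X_y=True)
rho, _ = spearmanr(X)                 # feature-by-feature Spearman correlation matrix

keep = []
for j in range(X.shape[1]):
    # keep feature j only if it is not highly positively correlated
    # with a feature that has already been kept
    if all(rho[i, j] <= 0.95 for i in keep):
        keep.append(j)

X_reduced = X[:, keep]                # smaller search space handed to the wrapper phase
print(f"{X.shape[1]} features reduced to {len(keep)} before the wrapper search")
```

Because this pruning happens once, up front, every subsequent wrapper evaluation works on fewer columns, which is the source of the reduced computation time claimed for CBBMRF.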