Table of Contents

Journal of Artificial Intelligence and Data Mining
Volume 8, Issue 3, Summer 2020

  • Publication date: 1399/05/22 (Solar Hijri)
  • Number of articles: 10
  • N. Mobaraki, R. Boostani *, M. Sabeti Pages 303-312

    Among the variety of meta-heuristic, population-based search algorithms, particle swarm optimization (PSO) with adaptive inertia weight (AIW) has been considered a versatile optimization tool that incorporates the experience of the whole swarm into the movement of particles. Although the exploitation ability of this algorithm is great, it cannot comprehensively explore the search space and may be trapped in a local minimum within a limited number of iterations. To increase its diversity and enhance its exploration ability, this paper inserts a chaotic factor, generated by three chaotic systems, along with a perturbation stage into AIW-PSO to avoid premature convergence, especially in complex nonlinear problems. To assess the proposed method, a known optimization benchmark containing nonlinear complex functions was selected, and its results were compared to those of standard PSO, AIW-PSO, and the genetic algorithm (GA). The empirical results demonstrate the superiority of the proposed chaotic AIW-PSO over its counterparts on 21 functions, which confirms the promising role of inserting randomness into AIW-PSO. The behavior of the error through the epochs shows that the proposed scheme can smoothly find proper minima in a timely manner without encountering premature convergence.

    Keywords: PSO-AIW, randomness, chaotic factor, swarm experience, convergence rate
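
    The velocity update described above can be sketched compactly. The block below is a hypothetical Python rendering of an AIW-PSO step with a chaotic factor and a perturbation stage; the logistic map, the way the chaotic factor scales the inertia weight, and the `p_perturb` probability are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def chaotic_aiw_pso_step(pos, vel, pbest, gbest, w, z,
                         c1=2.0, c2=2.0, p_perturb=0.1, rng=None):
    """One velocity/position update of an AIW-PSO variant with a chaotic factor.

    `z` is the state of a logistic map used as the chaotic factor; the
    perturbation stage randomly jolts a fraction of particles. Both choices
    are illustrative assumptions, not the paper's exact formulation.
    """
    rng = np.random.default_rng() if rng is None else rng
    z = 4.0 * z * (1.0 - z)                       # logistic map, chaotic in [0, 1]
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = (w * z) * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    # perturbation stage: re-scatter a few particles to preserve diversity
    mask = rng.random(len(pos)) < p_perturb
    pos[mask] += rng.normal(scale=0.1, size=pos[mask].shape)
    return pos, vel, z
```
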
  • A. Salehi, B. Masoumi * Pages 313-329

    The Biogeography-Based Optimization (BBO) algorithm has recently been of great interest to researchers for its simplicity of implementation, efficiency, and low number of parameters. BBO is one of the newer algorithms for optimization problems, developed based on the concept of biogeography. The algorithm uses the idea of animal migration to find suitable habitats for solving optimization problems. BBO has three principal operators, called migration, mutation, and elite selection. The migration operator plays a very important role in sharing information among the candidate habitats. The original BBO algorithm, due to its poor exploration and exploitation, sometimes does not produce desirable results. On the other hand, Edge Assembly Crossover (EAX) is one of the most powerful crossovers for generating offspring and increasing the diversity of the population. Combining biogeography-based optimization with EAX can provide high efficiency in solving optimization problems, including the traveling salesman problem (TSP). This paper proposes a combination of these approaches to solve the traveling salesman problem. The new hybrid approach was examined on standard TSP datasets from TSPLIB. In the experiments, the performance of the proposed approach was better than the original BBO and four other widely used metaheuristic algorithms.

    Keywords: Biogeography-Based Optimization, Evolutionary Algorithms, Traveling Salesman Problem
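
    For readers unfamiliar with BBO, the sketch below shows the standard migration operator for a continuous search space in Python. It is not the EAX hybrid of the paper (which operates on TSP tours); the linear emigration/immigration rates are the usual textbook choice.

```python
import numpy as np

def bbo_migration(population, fitness, rng=None):
    """Standard BBO migration operator (not the EAX hybrid of the paper).

    Habitats with better fitness get a high emigration / low immigration rate;
    each decision variable may be replaced by the corresponding variable of an
    emigrating habitat chosen roulette-wheel style by emigration rate.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = population.shape
    order = np.argsort(fitness)                      # ascending cost: best first
    rank = np.empty(n)
    rank[order] = np.arange(n)
    mu = (n - rank) / n                              # emigration rate (best ~ 1)
    lam = 1.0 - mu                                   # immigration rate
    new_pop = population.copy()
    for i in range(n):
        for k in range(d):
            if rng.random() < lam[i]:
                j = rng.choice(n, p=mu / mu.sum())   # pick source by emigration rate
                new_pop[i, k] = population[j, k]
    return new_pop
```
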
  • M. Abdollahi *, M. Aliyari Shoorehdeli Pages 331-341

    There are various automatic programming models inspired by evolutionary computation techniques. Due to the importance of devising an automatic mechanism to explore the complicated search space of mathematical problems where numerical methods fail, evolutionary computation is widely studied and applied to solve real-world problems. One of the well-known algorithms in optimization is the shuffled frog leaping algorithm (SFLA), which is inspired by the behaviour of frogs searching their environment both locally and globally to find the highest quantity of available food. The results of SFLA show that it is competitively effective in solving problems. In this paper, Shuffled Frog Leaping Programming (SFLP), inspired by SFLA, is proposed as a novel type of automatic programming model to solve symbolic regression problems based on a tree representation. In addition, a new mechanism for improving constant numbers in the tree structure is proposed in SFLP. In this way, different domains of mathematical problems can be addressed with the proposed method. To assess the performance of the solutions generated by SFLP, various experiments were conducted using a number of benchmark functions. The results were also compared with other evolutionary programming algorithms such as BBP, GSP, GP, and many variants of GP.

    Keywords: Genetic Programming, Shuffled Frog Leaping Algorithm, Shuffled Frog Leaping Programming, Regression Problems
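
    As background, the following Python sketch shows one round of the classical (continuous) shuffled frog leaping local search that SFLP builds upon; the tree-based programming layer and the constant-improvement mechanism of the paper are not reproduced here, and the re-initialization range is an assumption.

```python
import numpy as np

def sfla_memeplex_update(frogs, cost_fn, n_memeplexes, rng=None):
    """One round of the classical SFLA local search (continuous version)."""
    rng = np.random.default_rng() if rng is None else rng
    costs = np.array([cost_fn(f) for f in frogs])
    order = np.argsort(costs)                         # sort frogs, best first
    frogs, costs = frogs[order], costs[order]
    global_best = frogs[0]
    for m in range(n_memeplexes):
        idx = np.arange(m, len(frogs), n_memeplexes)  # round-robin memeplex partition
        worst, best = idx[-1], idx[0]
        # try leaping toward the memeplex best, then toward the global best
        for target in (frogs[best], global_best):
            candidate = frogs[worst] + rng.random() * (target - frogs[worst])
            cand_cost = cost_fn(candidate)
            if cand_cost < costs[worst]:
                frogs[worst], costs[worst] = candidate, cand_cost
                break
        else:
            # censor step: replace the worst frog with a random one
            frogs[worst] = rng.uniform(-1, 1, frogs.shape[1])  # assumed search range
            costs[worst] = cost_fn(frogs[worst])
    return frogs
```
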
  • M. Zeynali, H. Seyedarabi *, B. Mozaffari Tazehkand Pages 343-356

    Network security is very important when sending confidential data through a network. Cryptography is the science of hiding information, and combining cryptographic solutions with cognitive science has started a new branch, called cognitive cryptography, that guarantees the confidentiality and integrity of the data. Brain signals, as a biometric indicator, can be converted into a binary code that can be used as a cryptographic key. This paper proposes a new method for decreasing the error of the EEG-based key generation process. The Discrete Fourier Transform, Discrete Wavelet Transform, autoregressive modeling, energy entropy, and sample entropy were used to extract features. All features are used as the input of the new method, which is based on a window segmentation protocol, and are then converted to binary form. We obtain 0.76% and 0.48% mean Half Total Error Rate (HTER) for the 18-channel and single-channel cryptographic key generation systems, respectively.

    Keywords: Cryptography, Electroencephalogram (EEG), Security, Biometric cryptosystem
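
    The idea of turning windowed features into key bits can be illustrated with a small Python sketch. The equal-width windowing and the mean-threshold rule below are illustrative assumptions standing in for the paper's window segmentation protocol.

```python
import numpy as np

def features_to_key(features, window=8, reference=None):
    """Turn a real-valued EEG feature vector into a binary key fragment.

    Each window of features is averaged and compared with a reference value
    (e.g. an enrollment mean); windows above the reference emit 1, otherwise 0.
    The windowing and thresholding rule here are illustrative assumptions.
    """
    features = np.asarray(features, dtype=float)
    reference = features.mean() if reference is None else reference
    n_windows = len(features) // window
    bits = [
        int(features[i * window:(i + 1) * window].mean() > reference)
        for i in range(n_windows)
    ]
    return "".join(str(b) for b in bits)

# example: a 64-dimensional feature vector gives an 8-bit key fragment
print(features_to_key(np.random.randn(64)))
```
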
  • M. Kakooei, Y. Baleghi * Pages 357-370

    Semantic labeling is an active field in remote sensing applications. Although handling highly detailed objects in Very High Resolution (VHR) optical images and VHR Digital Surface Models (DSM) is a challenging task, it can improve the accuracy of semantic labeling methods. In this paper, a semantic labeling method is proposed based on the fusion of optical and normalized DSM data. Spectral and spatial features are fused into a heterogeneous feature map to train the classifier. The evaluation database classes are impervious surface, building, low vegetation, tree, car, and background. The proposed method is implemented on Google Earth Engine and consists of several levels. First, Principal Component Analysis is applied to vegetation indexes to find the most separable color space between vegetation and non-vegetation areas. The Gray Level Co-occurrence Matrix is computed to provide texture information as spatial features. Several Random Forests are trained with automatically selected training datasets. Several spatial operators follow the classification to refine the result. A Leaf-Less-Tree feature is used to solve the underestimation problem in tree detection. The area and the major and minor axes of connected components are used to refine building and car detection. Evaluation shows significant improvement in tree, building, and car accuracy, with appropriate overall accuracy and Kappa coefficient.

    Keywords: VHR Semantic labeling, Spatial feature, Google Earth Engine, GLCM, Random Forest
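
    The GLCM texture step can be reproduced offline with scikit-image, as in the hypothetical sketch below; the quantization level, distances, and angles are illustrative choices, and the paper's actual implementation runs on Google Earth Engine.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # scikit-image >= 0.19 naming

def glcm_texture_features(patch, levels=32):
    """Compute a few GLCM statistics for one image patch (offline sketch)."""
    patch = np.asarray(patch)
    q = (patch / patch.max() * (levels - 1)).astype(np.uint8)   # quantize to `levels`
    glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    # average each property over the chosen distances and angles
    return {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "energy", "correlation")}

print(glcm_texture_features(np.random.randint(0, 255, (64, 64))))
```
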
  • M. Salehi, J. Razmara *, Sh. Lotfi Pages 371-378

    Prediction of cancer survivability using machine learning techniques has become a popular approach in recent years. In this regard, an important issue is that the preparation of some features may require difficult and costly experiments, while these features have a less significant impact on the final decision and can be ignored from the feature set. Therefore, developing a machine for survivability prediction that ignores these features for simple cases and still yields acceptable prediction accuracy has become a challenge for researchers. In this paper, we have developed an ensemble multi-stage machine for survivability prediction which ignores difficult features for simple cases. The machine employs three basic learners, namely a multilayer perceptron (MLP), a support vector machine (SVM), and a decision tree (DT), in the first stage to predict survivability using simple features. If the learners agree on the output, the machine makes the final decision in the first stage. Otherwise, for difficult cases where the outputs of the learners differ, the machine makes the decision in the second stage using an SVM over all features. The developed model was evaluated using the Surveillance, Epidemiology, and End Results (SEER) database. The experimental results revealed that the developed machine obtains considerable accuracy while ignoring the difficult features for most of the input samples.

    Keywords: breast cancer survivability prediction, Ensemble learning, multi-stage machines, Feature Selection
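
    The two-stage decision rule lends itself to a compact sketch. The Python class below follows the logic in the abstract (a unanimous stage-1 vote on simple features, otherwise an SVM over all features); the column split and scikit-learn hyper-parameters are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

class TwoStageSurvivabilityMachine:
    """Two-stage ensemble in the spirit of the abstract above."""

    def __init__(self, simple_cols):
        self.simple_cols = simple_cols                 # indices of the 'simple' features
        self.stage1 = [MLPClassifier(max_iter=500), SVC(), DecisionTreeClassifier()]
        self.stage2 = SVC()                            # decides only the difficult cases

    def fit(self, X, y):
        for clf in self.stage1:
            clf.fit(X[:, self.simple_cols], y)         # stage 1: simple features only
        self.stage2.fit(X, y)                          # stage 2: all features
        return self

    def predict(self, X):
        votes = np.column_stack([clf.predict(X[:, self.simple_cols])
                                 for clf in self.stage1])
        agree = (votes == votes[:, [0]]).all(axis=1)   # unanimous stage-1 decision
        out = np.empty(len(X), dtype=votes.dtype)
        out[agree] = votes[agree, 0]
        if (~agree).any():
            out[~agree] = self.stage2.predict(X[~agree])
        return out
```
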
  • B. Hassanpour, N. Abdolvand *, S. Rajaee Harandi Pages 379-389

    The rapid development of technology, the Internet, and electronic commerce has led to the emergence of recommender systems, which assist users in finding and selecting their desired items. The accuracy of the recommendations is one of the main challenges of these systems. Given the capability of fuzzy systems to determine the borders of user interests, it seems reasonable to combine them with social network information and the factor of time. Hence, this study, for the first time, tries to assess the efficiency of recommender systems by combining fuzzy logic, longitudinal data, and social network information such as tags, friendships, and group memberships. The impact of the proposed algorithm on improving the accuracy of recommender systems was studied by specifying the neighborhood and the border between users' preferences over time. The results revealed that using longitudinal data and social network information in memory-based recommender systems improves the accuracy of these systems.

    Keywords: Recommender system, Social Network, Longitudinal Data, fuzzy logic, Tags
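
    One plausible way to combine recency with memory-based collaborative filtering, in the spirit of the abstract, is a time-weighted similarity; the exponential recency membership and half-life in the sketch below are illustrative assumptions, not the paper's fuzzy model.

```python
import numpy as np

def fuzzy_time_similarity(ratings_u, ratings_v, times_u, times_v, now, half_life=180.0):
    """Time-aware similarity between two users over their co-rated items.

    Ratings are weighted by an exponential recency membership (older ratings
    count less) before computing a weighted Pearson-like correlation.
    """
    w = np.exp(-np.log(2) * (now - np.minimum(times_u, times_v)) / half_life)
    ru = ratings_u - np.average(ratings_u, weights=w)
    rv = ratings_v - np.average(ratings_v, weights=w)
    num = np.sum(w * ru * rv)
    den = np.sqrt(np.sum(w * ru**2) * np.sum(w * rv**2))
    return 0.0 if den == 0 else num / den
```
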
  • V. Ghasemi *, M. Javadian, S. Bagheri Shouraki Pages 391-407

    In this work, a hierarchical, ensemble-based projected clustering algorithm for high-dimensional data is proposed. The basic concept of the algorithm is based on the active learning method (ALM), which is a fuzzy learning scheme inspired by some behavioral features of human brain functionality. The high-dimensional unsupervised active learning method (HUALM) is a clustering algorithm which blurs the data points as one-dimensional ink-drop patterns in order to summarize the effects of all data points, and then applies a threshold on the resulting vectors. It is based on an ensemble clustering method which performs one-dimensional density partitioning to produce an ensemble of clustering solutions. It then assigns a unique prime number, as a label, to the data points that fall in each partition. Consequently, a combination is performed by multiplying the labels of every data point in order to produce the absolute labels. Data points with identical absolute labels fall into the same cluster. The hierarchical property of the algorithm is intended to cluster complex data by zooming in on each already formed cluster to find further sub-clusters. The algorithm is verified using several synthetic and real-world datasets. The results show that the proposed method has promising performance compared to some well-known high-dimensional data clustering algorithms.

    Keywords: Ensemble Clustering, High Dimensional Clustering, Hierarchical Clustering, Unsupervised Active Learning Method
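
    The prime-label combination step is concrete enough to sketch. In the Python snippet below, equal-width binning stands in for the paper's ink-drop density partitioning (sympy is used only to fetch primes); everything else follows the description in the abstract: a unique prime per one-dimensional partition, products as absolute labels, identical products as clusters.

```python
import numpy as np
from sympy import prime

def prime_label_ensemble(data, n_bins=4):
    """Combine one-dimensional partitions via prime-number labels."""
    data = np.asarray(data, dtype=float)
    n, d = data.shape
    labels = np.full(n, 1, dtype=object)       # object dtype: products can be large
    next_idx = 1
    for k in range(d):
        # equal-width binning stands in for density partitioning
        edges = np.linspace(data[:, k].min(), data[:, k].max(), n_bins + 1)[1:-1]
        bins = np.digitize(data[:, k], edges)
        primes = [prime(next_idx + b) for b in range(n_bins)]   # unique prime per bin
        next_idx += n_bins
        labels *= np.array([primes[b] for b in bins], dtype=object)
    # identical products -> same cluster id
    _, cluster_ids = np.unique(labels.astype(str), return_inverse=True)
    return cluster_ids
```
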
  • A.R. Tajary *, E. Tahanian Pages 409-415

    Wireless network-on-chip (WiNoC) is one of the promising on-chip interconnection networks for system-on-chip architectures. In addition to wired links, these architectures also use wireless links, which allow packets to reach their destination nodes faster and with less power consumption. The wireless links are provided by wireless interfaces in wireless routers. WiNoC architectures differ in the position of the wireless routers and how they interact with other routers, so the placement of wireless interfaces is an important step in designing WiNoC architectures. In this paper, we propose a simulated annealing (SA) placement method that considers the routing algorithm as a factor in designing the cost function. To evaluate the proposed method, Noxim, a cycle-accurate network-on-chip simulator, is used. The simulation results show that the proposed method can reduce flit latency by up to 24.6% with only about a 0.2% increase in power consumption.

    Keywords: Simulated annealing, Wireless Network on Chip, Placement
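
    A generic simulated-annealing loop for choosing which routers host wireless interfaces might look like the Python sketch below; the neighbourhood move and cooling schedule are illustrative, and `cost_fn` stands in for the routing-aware latency/power objective described in the abstract.

```python
import math
import random

def sa_place_wireless_interfaces(n_routers, n_wi, cost_fn,
                                 iters=5000, t0=1.0, alpha=0.995):
    """Simulated-annealing search for wireless-interface positions in a NoC.

    A placement is a set of router indices that receive a wireless interface;
    neighbours are produced by moving one interface to another router.
    """
    current = set(random.sample(range(n_routers), n_wi))
    best, best_cost = set(current), cost_fn(current)
    cur_cost, t = best_cost, t0
    for _ in range(iters):
        cand = set(current)
        cand.remove(random.choice(list(cand)))                        # drop one interface
        cand.add(random.choice([r for r in range(n_routers) if r not in cand]))
        c = cost_fn(cand)
        # accept improvements, or worse placements with Boltzmann probability
        if c < cur_cost or random.random() < math.exp(-(c - cur_cost) / t):
            current, cur_cost = cand, c
            if c < best_cost:
                best, best_cost = set(cand), c
        t *= alpha                                                    # geometric cooling
    return best, best_cost
```
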
  • Gh. Ahmadi *, M. Teshnelab Pages 417-425

    Because of the interactions among the variables of a multiple-input multiple-output (MIMO) nonlinear system, its identification is a difficult task, particularly in the presence of uncertainties. The cement rotary kiln (CRK) is a MIMO nonlinear system in the cement factory with a complicated mechanism and uncertain disturbances. The identification of the CRK is very important for different purposes such as prediction, fault detection, and control. In previous works, the CRK was identified after decomposing it into several multiple-input single-output (MISO) systems. In this paper, for the first time, a rough-neural network (R-NN) is utilized for the identification of the CRK without the use of MISO structures. The R-NN is a neural structure designed on the basis of rough set theory for dealing with uncertainty and vagueness. In addition, a stochastic gradient descent learning algorithm is proposed for training the R-NNs. The simulation results show the effectiveness of the proposed methodology.

    Keywords: Cement Rotary Kiln, Rough-Neural Network, Stochastic Gradient Descent Learning, System Identification, Uncertainty
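
    As background on rough-neural networks, the sketch below shows the forward pass of a single rough neuron (a pair of lower/upper sub-neurons whose activations are combined by element-wise min and max); the equal-weight mixing of the two outputs is an assumption, and the paper's stochastic gradient descent training is not reproduced here.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rough_neuron_forward(x, w_lower, w_upper, b_lower, b_upper):
    """Forward pass of a rough neuron layer (lower/upper sub-neuron pair).

    Each sub-neuron computes its own weighted sum; the rough outputs are the
    element-wise min and max of the two activations, mixed into one output.
    """
    lo = sigmoid(x @ w_lower + b_lower)
    up = sigmoid(x @ w_upper + b_upper)
    out_lower, out_upper = np.minimum(lo, up), np.maximum(lo, up)
    return 0.5 * out_lower + 0.5 * out_upper     # assumed equal-weight combination

# example: 3 inputs -> 2 rough neurons
x = np.array([0.2, -0.4, 0.7])
w_l, w_u = np.random.randn(3, 2), np.random.randn(3, 2)
print(rough_neuron_forward(x, w_l, w_u, np.zeros(2), np.zeros(2)))
```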