Table of Contents


Journal of Advances in Computer Research
Volume 9, Issue 2, Spring 2018

  • Publication date: 1396/08/06 (Solar Hijri)
  • Number of articles: 8
  • S.M. Mousavi Pages 1-20
    In this paper, two methods, named the distance and triangle methods, are extended to evaluate the quality of approximations of the Pareto set obtained from solving bi-objective problems. To apply these evaluation methods, a bi-objective problem must first be defined; here, the problem considered is scheduling jobs in a hybrid flow shop environment with sequence-dependent setup times, with the objectives of minimizing both the makespan and the total tardiness. A bi-objective genetic algorithm from the literature is applied to solve this problem, which belongs to the NP-hard class. Within the algorithm, three alternatives for the dispatching rule and four for the neighborhood search structure are introduced, so twelve algorithm variants are derived from their combinations. After executing the algorithms, the resulting efficient sets are compared through several evaluation methods (a distance-style quality measure is sketched after this entry). Computational results show that the FIFO rule is the best dispatching-rule alternative for finding the job sequence in the second through final stages.
    Keywords: Data Envelopment Analysis, Distance method, Triangle method, Bi-objective problem
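    The paper's own distance and triangle methods are not reproduced in this abstract; as a rough illustration of a distance-style quality measure for an approximate Pareto front, the following Python sketch computes a generational-distance value against a reference front. All names and data here are hypothetical, not taken from the paper.

      import math

      def generational_distance(approx_front, reference_front):
          # Average Euclidean distance from each approximate point to its
          # nearest neighbour on the reference front; lower is better.
          total = sum(min(math.dist(p, r) for r in reference_front)
                      for p in approx_front)
          return total / len(approx_front)

      # (makespan, total tardiness) pairs from two hypothetical runs.
      reference = [(90, 5), (95, 3), (100, 1)]
      candidate = [(92, 6), (97, 4)]
      print(round(generational_distance(candidate, reference), 3))  # 2.236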
  • Bahman Botshekan Pages 21-36
    The rapid rise in popularity of multimedia applications, such as VoIP, IPTV, and video conferencing, intensifies the need to consider resource management for user satisfaction. Improving Quality of Experience (QoE) in Software Defined Network (SDN) services is one of the important issues to be addressed through optimal resource management. In this paper, resource allocation in SDN is considered to improve user-perceived quality in video applications. To this end, an intelligent learning approach based on a Weighted Fuzzy Petri Net decision algorithm is presented for QoE-driven SDN resource management (one common WFPN firing rule is sketched after this entry). The efficiency of the proposed system is evaluated through simulations under load from other applications such as FTP, email, and database traffic. The results show that, through control-plane supervision, the proposed algorithm improves Quality of Experience by decreasing delay, jitter, and bandwidth consumption while preserving network throughput.
    Keywords: Quality of Experience (QoE), Resource Management, Software Defined Network (SDN), Weighted Fuzzy Petri Net (WFPN)
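    The paper's exact decision algorithm is not given in the abstract; the sketch below shows one common Weighted Fuzzy Petri Net firing rule from the WFPN literature (a weighted average of input truth degrees, scaled by a rule certainty factor). The threshold, weights, and QoE symptoms are illustrative assumptions only.

      def wfpn_fire(inputs, weights, certainty, threshold=0.5):
          # One WFPN transition: fire when the weighted average of the
          # input truth degrees exceeds a threshold; the output token
          # carries that value scaled by the rule's certainty factor.
          activation = sum(d * w for d, w in zip(inputs, weights)) / sum(weights)
          return activation * certainty if activation >= threshold else 0.0

      # Example: high delay (0.8) and high jitter (0.6) feeding a
      # hypothetical "reallocate bandwidth" rule with certainty 0.9.
      print(round(wfpn_fire([0.8, 0.6], [0.7, 0.3], 0.9), 3))  # 0.666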
  • Ali Allahverdipour, Farhad Soleimanian Gharehchopogh Pages 37-48
    The Internet provides easy access to vast library resources. However, classifying documents within a large amount of data is still a challenge, and finding particular documents demands time and energy. Classifying similar documents into specific classes can reduce the time needed to search for the required data, particularly for text documents. This is further facilitated by Artificial Intelligence (AI) and optimization algorithms, which show high potential for Feature Selection (FS) and word extraction. In this paper, the Crow Search Algorithm (CSA) is used for FS and K-Nearest Neighbor (KNN) for classification. Additionally, the Term Frequency (TF) technique is used for counting words and calculating word frequencies (the TF and KNN steps are sketched after this entry). The analysis is performed on the Reuters-21578, WebKB, and Cade 12 datasets. The results indicate that the proposed model is more accurate in classification than the plain KNN model and shows a greater F-Measure than KNN and C4.5. Moreover, by using FS, the proposed model improves classification accuracy by 27% compared to KNN.
    Keywords: Text Documents Classification, Crow Search Algorithm, K-Nearest Neighbor
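    The CSA feature-selection step is omitted below; the vocabulary list simply stands in for a CSA-selected feature subset. This is a minimal sketch of the TF weighting and KNN voting the abstract describes, with toy documents and labels invented for illustration.

      from collections import Counter
      import math

      def tf_vector(doc, vocab):
          # Term frequency: count of each vocabulary word / document length.
          counts = Counter(doc.lower().split())
          n = len(doc.split())
          return [counts[w] / n for w in vocab]

      def knn_predict(x, train, k=3):
          # Majority vote among the k nearest training vectors.
          nearest = sorted(train, key=lambda item: math.dist(x, item[0]))[:k]
          return Counter(label for _, label in nearest).most_common(1)[0][0]

      vocab = ["market", "oil", "team", "goal"]  # stand-in for CSA-selected features
      train = [(tf_vector("oil market prices rise", vocab), "economy"),
               (tf_vector("market shares fall", vocab), "economy"),
               (tf_vector("team scores late goal", vocab), "sport")]
      print(knn_predict(tf_vector("oil market news", vocab), train, k=1))  # economy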
  • Kourosh Nemati Pages 49-69
    Early detection of lung nodules is extremely important for the diagnosis and clinical management of lung cancer. In this paper, a novel computer-aided detection (CAD) system for detecting pulmonary nodules in conventional chest radiographs is presented. The approach is based on a radial basis function neural network: a massive-training radial basis function neural network (MTRBFNN) is presented for classifying nodules versus non-nodules (a minimal RBF forward pass is sketched after this entry). The MTRBFNN is trained on a large number of overlapping sub-regions extracted from regions of interest (ROIs). Its efficiency was assessed with ROC curves, which show the overall sensitivity as a function of the number of false positives (non-nodules) per image. When the MTRBFNN was applied, false positives decreased; at certain operating points on the ROC curve they were reduced to 18% (99/550) of their original number. Overall, the MTRBFNN reduced the false-positive rate from 3.93 (550/140) to 0.71 (99/140) false positives per image while achieving a sensitivity of 92% (129/140).
    Keywords: Neural network, RBF, Massive training, CAD
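    A minimal sketch of a single-output RBF network forward pass, assuming Gaussian hidden units; the massive-training aspect (one training vector per overlapping sub-region) affects only how the inputs are generated, not this computation. All prototypes and values are hypothetical.

      import math

      def rbf_output(x, centers, widths, weights, bias=0.0):
          # Gaussian hidden units followed by a linear output layer.
          hidden = [math.exp(-math.dist(x, c) ** 2 / (2 * s ** 2))
                    for c, s in zip(centers, widths)]
          return sum(h * w for h, w in zip(hidden, weights)) + bias

      # Toy 2-pixel "sub-region" scored against a nodule-like prototype
      # (positive weight) and a background prototype (negative weight).
      score = rbf_output([0.9, 0.8],
                         centers=[[1.0, 1.0], [0.0, 0.0]],
                         widths=[0.5, 0.5],
                         weights=[1.0, -1.0])
      print(score > 0)  # True -> classified as nodule-like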
  • Leila Amirifar, Hajar Shafiee Pages 71-89
    Estimating the loss of human life caused by earthquakes is a crucial issue. Such estimates help administrators plan preventive measures and manage the crisis after an earthquake. In this article, we present a new model for estimating the loss of human life using self-organizing competitive neural networks (the winner-take-all update is sketched after this entry). The network is first trained on earthquake parameters such as depth, magnitude, peak ground acceleration, vibration velocity, duration, intensity, and the earthquake mechanism; the trained network is then used to predict and estimate the loss of human life. The model was tested on District 3 of Isfahan. Overall, the results show that Isfahan province can be classified in the third class of earthquake vulnerability.
    Keywords: Earthquakes, Loss of Human Life, Neural Networks, Estimates, Notions
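    A bare-bones sketch of winner-take-all competitive learning, the core update behind self-organizing competitive networks; the feature encoding and unit count here are illustrative assumptions, not the paper's configuration.

      import math
      import random

      def train_competitive(samples, n_units, epochs=50, lr=0.1, seed=0):
          # For each sample, move only the closest weight vector toward it;
          # the units self-organize into cluster prototypes (here, standing
          # in for casualty-level classes).
          rng = random.Random(seed)
          weights = [rng.choice(samples)[:] for _ in range(n_units)]
          for _ in range(epochs):
              for x in samples:
                  winner = min(weights, key=lambda w: math.dist(w, x))
                  for i in range(len(x)):
                      winner[i] += lr * (x[i] - winner[i])
          return weights

      # Toy features: (normalized magnitude, normalized depth).
      quakes = [[0.9, 0.2], [0.85, 0.25], [0.3, 0.8], [0.35, 0.75]]
      print(train_competitive(quakes, n_units=2))  # two cluster prototypes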
  • Sepideh Sheivandi, Sima Emadi Pages 91-102
    Web services, as independent software components, are published on the Internet by service providers and invoked at users' request. In many cases, however, no single service in the repository can satisfy the user's requirements. Service composition builds new components from existing services, so designing an efficient way to combine a chain of connected services is important; prior to composition, the key issue in finding suitable candidate services is their compliance with non-functional requirements. Recently, numerous studies have tried to reduce the search time for finding a service composition, but many of these methods must examine all Web services in the repository, which takes significant time. This paper provides an approach for automatic quality-aware service composition that also accounts for users' preferences in reaching the optimal composition. For this purpose, a modified graph-coloring method is used to filter large-scale data before composition, which shrinks the set of selected services (a greedy-coloring sketch follows this entry). The KPL algorithm then provides the user with several alternative solutions that can be used instead of the best composition if necessary. The evaluation results show that the proposed method improves memory consumption and runtime by about 20%.
    Keywords: coloring-based, service composition, Top-K algorithm, quality-aware service, KPL algorithm
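    The paper's modified coloring method is not spelled out in the abstract; for context, the following is plain greedy graph coloring over a hypothetical conflict graph of candidate services, the standard building block such a filter would adapt. Service names and the conflict relation are invented for illustration.

      def greedy_coloring(conflicts):
          # Assign each node the smallest color not used by its neighbours;
          # nodes sharing a color are mutually non-conflicting, so candidate
          # services can be grouped (and pruned) by color class.
          colors = {}
          for node in sorted(conflicts):
              used = {colors[n] for n in conflicts[node] if n in colors}
              colors[node] = next(c for c in range(len(conflicts)) if c not in used)
          return colors

      # Hypothetical candidate services with pairwise QoS conflicts.
      conflicts = {"s1": {"s2", "s3"}, "s2": {"s1"}, "s3": {"s1"}, "s4": set()}
      print(greedy_coloring(conflicts))  # {'s1': 0, 's2': 1, 's3': 1, 's4': 0}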
  • Yaser Nemati, Pirooz Shamsinejad Pages 103-112
    Our medical world is replete with clinical data, but this data is rarely exploited automatically to bring more health to our society. Much research has been conducted in medical data mining, but almost all of it has focused on diagnosing diseases rather than treating patients. In this paper we propose a Causality-based Medical Diagnosis and Treatment System, which can be used to diagnose a patient's disease and suggest treatments. The proposed system has three main subsystems: a Causal Network Extractor, a Diagnosis Subsystem, and a Treatment Suggesting Subsystem (a toy pipeline in this spirit is sketched after this entry).
    Two main features of our system are that it takes solely observational data as input and that it uses a causality-based action mining methodology. Action mining is a relatively new trend in data mining that aims to propose more actionable patterns to domain experts. We have implemented and tested our proposed method on real and synthetic data. The results show the superiority of our method over the current state-of-the-art method. Taking causality into account results in more reliable treatments and makes it possible to use this system in real-world situations.
    Keywords: Medical Diagnosis System, Automatic Medical Treatment, Action Mining, Causal Networks
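    A toy end-to-end pipeline mirroring the three subsystems named above. Every class, function, and data item here is a hypothetical stub (co-occurrence edges standing in for a real causal-discovery algorithm), not the paper's method.

      class CausalNetworkExtractor:
          def extract(self, records):
              # Stub: collect cause -> effect pairs from observational records.
              edges = {}
              for cause, effect in records:
                  edges.setdefault(cause, set()).add(effect)
              return edges

      def diagnose(network, symptoms):
          # Diagnosis subsystem: causes whose known effects cover the symptoms.
          return [c for c, effects in network.items() if symptoms <= effects]

      def suggest_treatment(disease, actions):
          # Treatment subsystem (stub): actions targeting the diagnosed disease.
          return [a for a, target in actions if target == disease]

      net = CausalNetworkExtractor().extract(
          [("flu", "fever"), ("flu", "cough"), ("allergy", "cough")])
      print(diagnose(net, {"fever", "cough"}))  # ['flu']
      print(suggest_treatment("flu", [("rest", "flu"), ("antihistamine", "allergy")]))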
  • Mohammadreza Ramezanpour, Reihaneh Khorsand Pages 113-122
    Intra coding in High Efficiency Video Coding (HEVC) can significantly improve compression efficiency by using 35 intra-prediction modes for 2^N×2^N luma blocks (N is an integer ranging from two to six). To find the luma block partitioning with the minimum rate-distortion cost, the encoder must perform 11932 different rate-distortion cost calculations (a rough count is sketched after this entry). Although this approach improves coding efficiency compared to previous standards such as H.264/AVC, the computational complexity increases significantly. In this paper, an intra-prediction technique is described that improves the performance of the HEVC standard by reducing its computational complexity. The proposed algorithm, called Prediction Unit Size Decision (PUSD), reduces the number of block sizes that must be evaluated. Simulation results show that the time complexity is decreased by about 36% while the bit rate increases by 1.1 kbps and the PSNR decreases by 0.6 dB. Accordingly, the proposed algorithm has a negligible effect on video quality while yielding a great saving in time complexity.
    Keywords: block size decision, HEVC, Intra coding, Temporary direction map, Prediction unit
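    A back-of-the-envelope count of intra rate-distortion evaluations per 64×64 coding tree unit, assuming a full quadtree of 2^N×2^N luma blocks (N = 2..6) and all 35 modes per block; this naive count lands at 11935, close to (but not exactly) the 11932 quoted above, so the paper presumably counts slightly differently.

      # Blocks per 64x64 CTU at each size, times 35 intra modes.
      blocks = sum((64 // size) ** 2 for size in (4, 8, 16, 32, 64))  # 341
      print(blocks, blocks * 35)  # 341 11935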