Table of Contents

Journal of Computing and Security
Volume:1 Issue: 3, Summer 2014

  • Publication date: 1393/05/25
  • Number of articles: 6
  • Siavash Khodambashi*, Ali Zakerolhosseini Pages 179-186
    Recently, the laws of quantum physics have had a profound impact on classical cryptography and have helped researchers provide secure communication in the presence of adversaries. In this paper, we present a novel weak blind signature scheme whose security is guaranteed by fundamental principles of quantum physics. Unlike previous schemes, which take advantage of quantum entangled states, our proposed quantum blind signature relies only on the Quantum Key Distribution (QKD) protocol. We show throughout this paper that the proposed quantum weak blind signature performs well in terms of security and reliability. In addition, the proposed scheme can be commercialized with current technology; hence, it can be widely used for e-payment, e-government, e-business, and similar applications.
    Keywords: Quantum Weak Blind Signature, Quantum Key Distribution, Quantum Cryptography, Quantum Payment
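    The following is a minimal, classical sketch of a blind-signature-style flow in which the keys are assumed to have been pre-established over QKD; it is not the authors' protocol, and all function names (qkd_shared_key, blind, sign, verify) are hypothetical illustrations of the blinding idea.
```python
# Illustrative sketch only: keys are modeled as random byte strings standing in
# for keys agreed over a QKD channel. NOT the authors' protocol.
import hashlib
import secrets

def qkd_shared_key(n_bytes: int = 32) -> bytes:
    """Stand-in for a key established via QKD (modeled here as random bytes)."""
    return secrets.token_bytes(n_bytes)

def blind(message: bytes, blinding_key: bytes) -> bytes:
    """Blind the message with a one-time pad so the signer cannot read it."""
    return bytes(m ^ k for m, k in zip(message, blinding_key))

def sign(blinded: bytes, signing_key: bytes) -> bytes:
    """Signer produces a tag over the blinded message using a shared key."""
    return hashlib.sha256(signing_key + blinded).digest()

def verify(blinded: bytes, tag: bytes, signing_key: bytes) -> bool:
    return secrets.compare_digest(tag, hashlib.sha256(signing_key + blinded).digest())

# Usage: the owner blinds, the signer signs without seeing the content,
# and the verifier checks the tag against the blinded message.
msg = b"pay 10 units to merchant #42".ljust(32, b" ")
k_blind = qkd_shared_key(len(msg))   # shared between owner and verifier
k_sign = qkd_shared_key()            # shared between signer and verifier
blinded_msg = blind(msg, k_blind)
tag = sign(blinded_msg, k_sign)
assert verify(blinded_msg, tag, k_sign)
```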
  • Omid Fallah Mehrjardi*, Behrouz Shahgholi Ghahfarokhi, Hamid Mala, Naser Movahhedinia Pages 187-204
    Femtocells are widely used to improve poor indoor coverage and to decrease the high costs of cellular networks. The fact that femtocells and macrocells share similar frequency bands leads to a major challenge in time/frequency resource allocation. Addressing fairness among femtocells and improving time/frequency resource utilization are the main objectives of previous studies, some of which also consider the balance between femtocell-level fairness and utilization. Providing user-level fairness with respect to User Equipment (UE) demands, however, is an issue that has not received adequate attention so far. Here, a centralized resource allocation algorithm is proposed to improve the balance between user-level fairness and radio resource utilization, in which the demands of UEs for radio resources are taken into account. The algorithm proceeds in two independent phases. The first phase assigns resources to femtocells in a greedy manner, based on the proposed priority, to increase the utilization of the reused spectrum. The second phase is designed to guarantee fairness among UEs by considering their required resources whenever the residual resources are not proportional to the unmet requirements. Compared to conventional methods, the proposed approach improves utilization, fairness, and the average amount of satisfied demand.
    Keywords: Femtocell, Resource Allocation, User Demand, Fairness, Utilization
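    Below is a minimal, hypothetical sketch of a two-phase allocation in the spirit of this abstract: a greedy phase that reuses resource blocks (RBs) across non-interfering femtocells, followed by a phase that reports proportional shares of residual demand. The interference sets, demands, RB counts, and priority rule are illustrative assumptions, not the paper's exact algorithm.
```python
# Two-phase femtocell resource allocation sketch (illustrative only).

def allocate(num_rbs, femto_demand, interferers):
    """femto_demand: {femto_id: demanded RBs}; interferers: {femto_id: set of conflicting femtos}."""
    allocation = {f: set() for f in femto_demand}
    # Phase 1: greedy reuse -- give each RB to every femtocell that still needs it
    # and does not conflict with a femtocell already holding that RB.
    for rb in range(num_rbs):
        holders = set()
        # serve higher-demand femtocells first (a simple stand-in priority rule)
        for f in sorted(femto_demand, key=femto_demand.get, reverse=True):
            if len(allocation[f]) < femto_demand[f] and not (interferers[f] & holders):
                allocation[f].add(rb)
                holders.add(f)
    # Phase 2: if demand remains unmet, compute proportional fairness shares
    # of whatever residual capacity would be distributed.
    unmet = {f: femto_demand[f] - len(allocation[f]) for f in femto_demand}
    total_unmet = sum(unmet.values())
    shares = {f: (unmet[f] / total_unmet if total_unmet else 0.0) for f in unmet}
    return allocation, shares

# Usage with toy numbers: F1 and F2 interfere, F3 can reuse freely.
alloc, fairness_shares = allocate(
    num_rbs=10,
    femto_demand={"F1": 6, "F2": 5, "F3": 4},
    interferers={"F1": {"F2"}, "F2": {"F1"}, "F3": set()},
)
print({f: len(rbs) for f, rbs in alloc.items()}, fairness_shares)
```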
  • Farideh Halakou*, Mahdi Eftekhari, Ali Esmailizadeh Pages 205-213
    During the last decade, applying feature selection methods in bioinformatics has become a necessity for model building. This is due to the high-dimensional nature of many modeling tasks in bioinformatics, one of them being Single Nucleotide Polymorphism (SNP) selection. In this paper, we propose three hybrid feature selection methods named CNNFS, Ck-NNFS, and CRRFS, which are combinations of filter and wrapper techniques. In our methods, filter techniques are first applied to remove irrelevant/redundant features. Then, in the second step, wrapper techniques refine the primary feature subset obtained from the first step. Neural Network, k-Nearest Neighbor, and Ridge Regression are employed in the wrapper phase as induction algorithms. Since pure wrapper methods take a long time to run on high-dimensional data, we compared our methods with three well-known filter methods rather than with pure wrappers. The results clearly show the good performance of the hybrid methods, in addition to their dimensionality-reduction ability, in SNP selection. The CRRFS algorithm produced the most satisfactory results with regard to the precision of recognizing candidate SNPs and their recall in the final SNP subset.
    Keywords: Feature Selection, Single Nucleotide Polymorphisms, Neural Network, k-Nearest Neighbor, Ridge Regression
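    The snippet below sketches a generic filter-then-wrapper pipeline in the spirit of the hybrid methods described above, using scikit-learn and synthetic data as stand-ins for SNP genotypes; the paper's exact CNNFS/Ck-NNFS/CRRFS procedures are not reproduced, and k-NN is used only as an example induction algorithm, loosely analogous to the Ck-NNFS variant.
```python
# Generic filter + wrapper feature selection sketch (illustrative only).
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif, SequentialFeatureSelector
from sklearn.neighbors import KNeighborsClassifier

# Synthetic high-dimensional data standing in for SNP features.
X, y = make_classification(n_samples=200, n_features=500, n_informative=15, random_state=0)

# Step 1 (filter): cheaply drop irrelevant features with a univariate test.
flt = SelectKBest(score_func=f_classif, k=50).fit(X, y)
X_filtered = flt.transform(X)

# Step 2 (wrapper): refine the surviving subset with a k-NN-guided search.
knn = KNeighborsClassifier(n_neighbors=5)
sfs = SequentialFeatureSelector(knn, n_features_to_select=10, cv=3).fit(X_filtered, y)

# Map the wrapper's choices back to the original feature indices.
selected = flt.get_support(indices=True)[sfs.get_support(indices=True)]
print("selected feature indices:", selected)
```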
  • Azam Amouzadi *, Abdolreza Mirzaei Pages 215-224
    The main objective of this article is to improve the accuracy of Mamdani fuzzy rule-based classification systems. Although these systems tend to perform successfully with respect to interpretability, they suffer from rigid pattern space partitioning. Therefore, a new hierarchical fuzzy rule-based classifier based on binary-tree decomposition is proposed here to develop a more flexible pattern space partitioning. The decomposition process is controlled by the fuzzy entropy of each partition. The final rule sets obtained by the proposed method are pruned to overcome the overfitting problem. The performance of this method is compared with several fuzzy and non-fuzzy classification methods on a set of benchmark classification tasks. The experimental results indicate the good performance of the proposed algorithm.
    Keywords: Mamdani Fuzzy Rule-Based Classification Systems, Hierarchical Fuzzy Rules, Fuzzy Entropy
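    As a rough illustration of entropy-controlled binary-tree decomposition of the pattern space, the sketch below recursively splits a partition while its fuzzy entropy stays high. Membership-function design, Mamdani rule generation, and pruning are omitted; the entropy formula and the median split are assumptions made for this sketch, not the paper's method.
```python
# Entropy-controlled binary splitting of the pattern space (illustrative only).
import numpy as np

def fuzzy_entropy(memberships):
    """Entropy of soft class memberships in a partition (rows sum to 1)."""
    p = memberships.mean(axis=0)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def split_partition(X, memberships, depth=0, max_depth=3, threshold=0.5):
    """Recursively halve the partition along its widest feature while entropy is high."""
    if depth >= max_depth or fuzzy_entropy(memberships) <= threshold or len(X) < 4:
        return [(X, memberships)]                      # leaf: one rule region
    feat = int(np.argmax(X.max(axis=0) - X.min(axis=0)))
    cut = np.median(X[:, feat])
    left = X[:, feat] <= cut
    if left.all() or not left.any():                   # degenerate split, stop
        return [(X, memberships)]
    return (split_partition(X[left], memberships[left], depth + 1, max_depth, threshold)
            + split_partition(X[~left], memberships[~left], depth + 1, max_depth, threshold))

# Usage with toy data: noisy soft labels for a 2-class problem.
rng = np.random.default_rng(0)
X = rng.random((100, 2))
m = np.clip(np.column_stack([X[:, 0], 1 - X[:, 0]]) + rng.normal(0, 0.05, (100, 2)), 0, 1)
m /= m.sum(axis=1, keepdims=True)
leaves = split_partition(X, m)
print("number of rule regions:", len(leaves))
```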
  • Mahdie Rezaeian *, Rasoul Amirfattahi, Saeid Sadri Pages 225-238
    This paper presents a semantic method for aerial image segmentation. Multi-class aerial images often feature large intra-class variations and inter-class similarities. Furthermore, shadows, reflections, changes in viewpoint, high and varying altitude, and the variability of natural scenes pose serious problems for simultaneous segmentation. The main purpose of segmenting aerial images is to make the subsequent recognition phase straightforward. The present algorithm combines the two challenging tasks of segmentation and classification in such a way that no extra recognition phase is needed. This algorithm is intended to be part of a system that will be developed to automatically locate appropriate sites for Unmanned Aerial Vehicle (UAV) landing. With this perspective, we focus on segregating natural and man-made areas in aerial images. We compared different classifiers and experimentally explored the best set of features for this task. In addition, a certainty-based method is used to integrate color and texture descriptors more efficiently. The experimental results over a dataset of 25 high-resolution images show an overall binary segmentation accuracy of 91.34%.
    Keywords: Aerial Images, Semantic Segmentation, Classification, Local Binary Patterns, Feature Fusion, Artificial Neural Network, Support Vector Machine, Random Forest
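    A minimal patch-wise sketch in the spirit of this pipeline is shown below: it concatenates mean color with an LBP texture histogram per patch and classifies patches as natural vs. man-made with a random forest. The scikit-image/scikit-learn calls, patch size, and dummy labels are assumptions; the paper's certainty-based fusion is not reproduced.
```python
# Patch-wise color + LBP features with a random-forest classifier (illustrative only).
import numpy as np
from skimage.color import rgb2gray
from skimage.feature import local_binary_pattern
from skimage.util import img_as_ubyte
from sklearn.ensemble import RandomForestClassifier

def patch_features(patch_rgb, P=8, R=1):
    """Concatenate mean RGB color with a uniform-LBP histogram of the grey patch."""
    color = patch_rgb.reshape(-1, 3).mean(axis=0)
    gray = img_as_ubyte(rgb2gray(patch_rgb))
    lbp = local_binary_pattern(gray, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    return np.concatenate([color, hist])

def image_to_patches(img, size=32):
    """Cut a non-overlapping grid of square patches from an image."""
    h, w = img.shape[:2]
    return [img[y:y + size, x:x + size]
            for y in range(0, h - size + 1, size)
            for x in range(0, w - size + 1, size)]

# Usage with synthetic "images" standing in for labelled aerial imagery.
rng = np.random.default_rng(0)
train_imgs = [rng.random((64, 64, 3)) for _ in range(4)]
patches = [p for img in train_imgs for p in image_to_patches(img)]
X = np.array([patch_features(p) for p in patches])
y = rng.integers(0, 2, len(X))          # 0 = natural, 1 = man-made (dummy labels)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("patch predictions:", clf.predict(X[:5]))
```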
  • Bagher Saberi, Nasser Ghadiri* Pages 239-249
    Spatial data plays an emerging role in new technologies such as web and mobile mapping and Geographic Information Systems (GIS). Important decisions in political, social, and many other aspects of modern human life are being made using location data. Decision makers in many countries exploit spatial databases for collecting information, analyzing it, and planning for the future. In fact, not every spatial database is suitable for this type of application. Inaccuracy, imprecision, and other deficiencies are present in location data just as in any other type of data, and may have a negative impact on the credibility of any action taken based on unrefined information. We therefore need a method for evaluating the quality of spatial data and separating usable data from misleading data that leads to weak decisions. On the other hand, spatial databases are usually huge, so working with this type of data has a negative impact on efficiency. To improve the efficiency of working with spatial big data, we need a method for shrinking the volume of data. Sampling is one such method, but its negative effects on data quality are inevitable. In this paper, we try to show and assess the change in the quality of spatial data that results from sampling. We used this approach to evaluate the quality of sampled spatial data related to mobile user trajectories in China, which are available in a well-known spatial database. The results show that sample-based control of data quality increases query performance significantly without losing too much accuracy. Based on these results, some future improvements are pointed out that will help process location-based queries faster than before and make more accurate location-based decisions in limited time.
    Keywords: Algebraic Operations, Data Integration, Quality Metrics, Spatial Big Data, Location Data
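    The sketch below illustrates the trade-off discussed in this abstract: the same spatial query is run on the full point set and on a random sample, and the answers and running times are compared. The synthetic points, the sampling rate, and the bounding-box query are assumptions for illustration, not the paper's quality metrics or dataset.
```python
# Sampling vs. accuracy trade-off for a simple spatial query (illustrative only).
import time
import numpy as np

rng = np.random.default_rng(0)
# Synthetic (lat, lon) points roughly covering an urban region.
points = rng.uniform([39.4, 115.7], [40.5, 117.4], size=(2_000_000, 2))

def count_in_box(pts, lat_min, lat_max, lon_min, lon_max):
    """Count points falling inside a latitude/longitude bounding box."""
    return int(((pts[:, 0] >= lat_min) & (pts[:, 0] <= lat_max) &
                (pts[:, 1] >= lon_min) & (pts[:, 1] <= lon_max)).sum())

box = (39.9, 40.0, 116.3, 116.5)

t0 = time.perf_counter()
full = count_in_box(points, *box)
t_full = time.perf_counter() - t0

rate = 0.05                                   # keep 5% of the points
sample = points[rng.random(len(points)) < rate]
t0 = time.perf_counter()
est = count_in_box(sample, *box) / rate       # scale the sampled answer back up
t_sample = time.perf_counter() - t0

print(f"full answer {full}, sampled estimate {est:.0f}, "
      f"relative error {abs(est - full) / full:.3%}, speedup {t_full / t_sample:.1f}x")
```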