Table of Contents

Journal of Information Systems and Telecommunication, Volume 1, Issue 0, Oct-Dec 2012

  • Publication date: 1391/10/01
  • Number of articles: 8
  • Mostafa Kazemi, Mohammad Lagzian, Gholamreza Malekzadeh, Samira Pour Page 1
    Organizational Intelligence and Business Intelligence shape the organizational rules and resources. The information processing capability of the organization, a main factor of organizational intelligence, is influenced and guided by pre-existing rules and resources. Structures act as information filtering mechanisms that process information consistent with the current organizational schemas. The people who control the resources also strongly influence the information processing capability of the organization, and this capability is determined by top management, the group of people steering the organization. This paper discusses perceptions of organizational intelligence and its dimensions in Iranian universities, taking several dimensions and components into account. Since the main objective of this research is to outline the dimensions and components of organizational intelligence in universities, expert panel opinions were gathered and a conceptual model was proposed and tested to assess the organization's capacity for learning and its intelligence coefficient. The model consists of eight dimensions that must be monitored: structural, cultural, strategic, communicational, informational, functional, behavioral, and environmental. Each dimension consists of several components (36 in total). Based on these dimensions and components, the organizational intelligence profile of a university can be calculated.
    Keywords: Organizational Intelligence, Information System, Decision Making System, Delphi method, Information Processing capability, Structural Equation Model
  • Sara Nazari, Mohammad Shahram Moin Page 19
    In this paper, a novel cover selection approach for steganography is proposed, based on image texture features and the human visual system. The proposed algorithm employs the run-length matrix to extract suitable images from a large image database using image texture features, and creates stego versions of the selected images after the embedding process. It then computes the similarity between the original selected images and their stego versions, using structural similarity (SSIM) as the image quality metric, and selects the image with maximum similarity to its stego version as the best cover. Results of combining the proposed cover selection algorithm with other steganography methods confirm that it increases stego quality. We also evaluate the robustness of the algorithm against steganalysis methods such as wavelet-based and block-based steganalysis; the experimental results show that the proposed approach greatly decreases the risk of hidden information being detected.
    Keywords: Steganography, Cover Selection, Run length matrix, Image texture features, SSIM
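The selection criterion described above can be sketched in a few lines: among candidate covers, pick the one whose stego version stays closest to it under SSIM. This is an illustrative sketch, not the paper's implementation: it uses a single-window (global) SSIM over grayscale pixel lists rather than the usual windowed SSIM, and a toy LSB embedder stands in for the real steganography method.

```python
# Minimal sketch of SSIM-guided cover selection (illustrative assumptions:
# global single-window SSIM, toy LSB embedding in place of the real method).

def mean(xs):
    return sum(xs) / len(xs)

def ssim(x, y, L=255):
    # Global SSIM over two equal-length grayscale pixel lists.
    c1, c2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    mx, my = mean(x), mean(y)
    vx = mean([(p - mx) ** 2 for p in x])
    vy = mean([(p - my) ** 2 for p in y])
    cov = mean([(p - mx) * (q - my) for p, q in zip(x, y)])
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

def embed_lsb(pixels, bits):
    # Toy least-significant-bit embedding, one bit per leading pixel.
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b
    return out

def best_cover(candidates, bits):
    # The cover whose stego version has maximum SSIM to the original.
    return max(candidates, key=lambda img: ssim(img, embed_lsb(img, bits)))
```

A real system would compute SSIM over local windows and use the paper's run-length texture features to pre-filter the candidate set before this step.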
  • Mohammad Haji Seyed Javadi, Hamid Reza Mahdiani, Esmaeil Zeinali Kh Page 29
    Fixed-point arithmetic suffers from significant drawbacks such as limited dynamic range: fixed-point hardware does not provide acceptable accuracy levels when it deals with large and small numbers simultaneously. Although floating-point arithmetic largely addresses this problem, it is not widely used because it faces important challenges when realized in hardware. A novel computational paradigm named Dynamic Fixed-Point (DFP) is proposed in this paper, which provides improved precision levels while having VLSI implementation costs similar to traditional fixed-point. The accuracy simulation results and VLSI synthesis costs of the new method are presented and compared with fixed-point to demonstrate its efficiency.
    Keywords: Finite Precision Arithmetic, Fixed-Point, Floating-Point, VLSI
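The dynamic-range problem the abstract describes can be illustrated numerically. The sketch below contrasts a classic fixed-point quantizer (one global binary point) with a dynamic variant that keeps a fixed mantissa width but picks the scale exponent per value; the concrete DFP format of the paper is not given in the abstract, so this is only the general idea.

```python
# Illustrative contrast between static fixed-point quantization and a
# dynamic fixed-point style quantizer (per-value scale exponent).
# The paper's actual DFP format may differ; this shows the motivation.

import math

def fixed_quantize(x, frac_bits=8):
    # Classic fixed-point: global scale 2**frac_bits, limited dynamic range.
    return round(x * (1 << frac_bits)) / (1 << frac_bits)

def dfp_quantize(x, mantissa_bits=8):
    # Dynamic variant: choose the exponent so the mantissa uses all bits.
    if x == 0:
        return 0.0
    exp = math.floor(math.log2(abs(x))) - (mantissa_bits - 1)
    return round(x / 2.0 ** exp) * 2.0 ** exp
```

With 8 fractional bits, a static quantizer flushes 0.001 to zero entirely, while the dynamic quantizer keeps it to within about 0.1% relative error and still represents 1000.0 exactly, illustrating how a movable binary point recovers dynamic range without floating-point hardware.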
  • S. Ghazi Maghrebi (Member, IEEE), B. Haji Bagher Naeeni, M. Lotfizad Page 35
    High-speed, real-time applications such as OFDM motivate methods for increasing the efficiency of such systems. To this end, we study the possibility of improving OFDM by employing sliced multi-modulus algorithm (S-MMA) equalization. In this paper, we apply LMS, MMA and S-MMA equalization to per-tone equalization in OFDM modulation. The contribution lies in using the S-MMA technique for weight adaptation to decrease the BER in OFDM modulation. Simulation results over standard channels with AWGN noise and ISI impairment show that S-MMA outperforms the LMS and MMA algorithms. Both analysis and simulations demonstrate the advantage of the proposed equalization over the well-known LMS equalization. Therefore, S-MMA equalization is a good choice for high-speed and real-time applications such as OFDM.
    Keywords: Constellation, Cyclic prefix, ICI, ISI, OFDM
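A tap-update step of the kind the abstract compares can be sketched as follows. In MMA the per-dimension error uses fixed dispersion constants R; in one common formulation of the sliced variant, the constant is replaced by the squared slicer (decision) output. The slicer levels, R value and step size below are illustrative assumptions, and the paper's exact S-MMA formulation may differ.

```python
# Sketch of MMA-style blind equalizer tap updates on complex symbols.
# Assumption: S-MMA replaces the fixed dispersion constant with the
# squared slicer decision; constants here are illustrative.

def slicer(v, levels=(-3, -1, 1, 3)):
    # Nearest-level decision for one real dimension (e.g. a 16-QAM rail).
    return min(levels, key=lambda L: abs(v - L))

def mma_error(y, R=8.2):
    # y: complex equalizer output; error computed per real/imag dimension.
    return complex(y.real * (y.real ** 2 - R),
                   y.imag * (y.imag ** 2 - R))

def smma_error(y):
    # Sliced MMA: dispersion constant taken from the squared decision.
    dr, di = slicer(y.real), slicer(y.imag)
    return complex(y.real * (y.real ** 2 - dr ** 2),
                   y.imag * (y.imag ** 2 - di ** 2))

def update_taps(w, x, y, mu=1e-4, sliced=True):
    # Stochastic-gradient tap update: w <- w - mu * e * conj(x).
    e = smma_error(y) if sliced else mma_error(y)
    return [wi - mu * e * xi.conjugate() for wi, xi in zip(w, x)]
```

Note how the sliced error vanishes once the output sits exactly on a constellation rail, while the plain MMA error does not, which is the intuition behind the lower steady-state BER reported above.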
  • Mohammad Reza Gholamian, Kamran Shahanaghi, Seyed Mahdi Sadatrasoula, Zeynab Hajimohammadi Page 41
    Several classification techniques have been applied to the loan default prediction problem, such as support vector machines, neural networks and rule-based classifiers. Rule bases are favored in credit decision making because of their ability to explicitly distinguish between good and bad applicants. In a credit scoring context, imbalanced data sets frequently occur, as the number of good loans in a portfolio is usually much higher than the number of loans that default. This paper explores the suitability of RIPPER, OneR, Decision Table, PART and C4.5 for loan default prediction rule extraction. A real database of export loans from an Iranian bank is used; class imbalance issues are investigated by randomly oversampling the minority class of defaulters and undersampling the majority class of non-defaulters by a factor of three. The performance criteria chosen to measure this effect are the area under the receiver operating characteristic curve (AUC), accuracy, and the number of rules. Friedman's statistic is used to test for significant differences between techniques and datasets. The results show that PART is the best classifier on both the balanced and the imbalanced datasets.
    Keywords: Scoring, Banking Industry, Rule extraction, Imbalanced data, Sampling
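The rebalancing step described above can be sketched directly: undersample the majority (non-defaulter) class and randomly oversample the minority (defaulter) class up to the same size. The 1:3 undersampling follows the abstract; the target of exactly equal class sizes is an illustrative assumption.

```python
# Sketch of random over/undersampling for an imbalanced credit data set.
# Assumptions: majority is relabeled 0, classes balanced to equal size.

import random

def rebalance(samples, labels, minority=1, seed=0):
    rng = random.Random(seed)
    minor = [s for s, y in zip(samples, labels) if y == minority]
    major = [s for s, y in zip(samples, labels) if y != minority]
    # Undersample the majority to one third of its size ("three times
    # undersampling"), then oversample the minority up to that size.
    kept_major = rng.sample(major, max(1, len(major) // 3))
    over_minor = minor + [rng.choice(minor)
                          for _ in range(max(0, len(kept_major) - len(minor)))]
    data = [(s, 0) for s in kept_major] + [(s, minority) for s in over_minor]
    rng.shuffle(data)
    return [s for s, _ in data], [y for _, y in data]
```

Random oversampling duplicates minority records rather than synthesizing new ones, so the rule learners (RIPPER, PART, C4.5, etc.) see a balanced class prior without any new feature values being invented.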
  • Hannane Mahdavinataj, Babak Nasersharif Page 51
    In pattern recognition, feature transformation methods map features to a new space with the aim of obtaining more discriminative or orthogonal features, and thus more separable classes. In this paper, a linear feature transformation method is presented that considers class discrimination and feature orthogonality criteria simultaneously. The transformation is obtained using a genetic algorithm whose fitness function is built from these two criteria. For the feature discrimination criterion, the Dunn index is used; for feature orthogonality and independence, the covariance matrix is used, specifically the ratio of the sum of its diagonal elements to the sum of its non-diagonal elements. Experiments on UCI datasets show that the proposed feature transformation performs better than or as well as other well-known linear transformation methods.
    Keywords: Feature Transformation, Covariance Matrix, Dunn Index, Genetic Algorithm
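The two fitness ingredients named above are both directly computable, and a candidate fitness can be sketched as their combination. The weighted sum used here is an assumption for illustration; the abstract does not state how the paper combines the criteria.

```python
# Sketch of a GA fitness combining the Dunn index (class discrimination)
# with the diagonal/off-diagonal covariance ratio (feature orthogonality).
# The weighted combination `w` is a hypothetical choice.

def transform(A, X):
    # Apply linear transform A (list of rows) to each sample in X.
    return [[sum(a * x for a, x in zip(row, s)) for row in A] for s in X]

def dist(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def dunn_index(classes):
    # min inter-class distance divided by max intra-class diameter.
    inter = min(dist(p, q) for i, ci in enumerate(classes)
                for cj in classes[i + 1:] for p in ci for q in cj)
    intra = max(dist(p, q) for c in classes for p in c for q in c)
    return inter / intra if intra else float("inf")

def cov_diag_ratio(X):
    n, d = len(X), len(X[0])
    mu = [sum(s[k] for s in X) / n for k in range(d)]
    C = [[sum((s[i] - mu[i]) * (s[j] - mu[j]) for s in X) / n
          for j in range(d)] for i in range(d)]
    diag = sum(abs(C[i][i]) for i in range(d))
    off = sum(abs(C[i][j]) for i in range(d) for j in range(d) if i != j)
    return diag / (off + 1e-12)

def fitness(A, classes, w=0.5):
    # Higher is better on both criteria (hypothetical weighting).
    allX = [s for c in classes for s in c]
    tc = [transform(A, c) for c in classes]
    return w * dunn_index(tc) + (1 - w) * cov_diag_ratio(transform(A, allX))
```

In the full method, the genetic algorithm would evolve the entries of A, evaluating each chromosome with a fitness of this shape.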
  • Hoda Banki, Seyed Morteza Babamir, Azam Farokh, Mohammad Mehdi Morovati Page 57
    This research shows the influence of multi-core architecture on decreasing the execution time and increasing the performance of two software fault tolerance techniques that have superiority over the others: (1) N-version Programming and (2) Consensus Recovery Block. Accordingly, our implementations were based on these two methods. Because a comparison showed that the Consensus Recovery Block is more reliable, we propose a technique named Improved Consensus Recovery Block to improve its performance. The proposed technique offers higher performance while retaining the same reliability as the Consensus Recovery Block technique. The performance improvement is based on a multi-core architecture in which each core is responsible for executing one version of the software's key units. As a result, the concurrent execution of versions decreases the total execution time, leading to improved performance.
    Keywords: Software Fault Tolerance, Multi-core, Concurrent Execution, Consensus Recovery Block, N-version Programming, Acceptance Test
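The control flow of a consensus recovery block on a multi-core host can be sketched as: run the independently developed versions concurrently, accept the majority answer, and fall back to an acceptance test over individual results when no consensus exists. The version functions below are toy stand-ins, and a thread pool is used for portability (a CPU-bound system would typically use processes for true multi-core speedup).

```python
# Sketch of a consensus recovery block with concurrent version execution.
# Versions and the acceptance test are hypothetical stand-ins.

from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def consensus_recovery_block(versions, x, acceptable):
    # Run all versions concurrently (one worker per version).
    with ThreadPoolExecutor(max_workers=len(versions)) as pool:
        results = list(pool.map(lambda v: v(x), versions))
    value, votes = Counter(results).most_common(1)[0]
    if votes >= 2:                      # consensus among versions
        return value
    for r in results:                   # recovery: first acceptable result
        if acceptable(r):
            return r
    raise RuntimeError("no acceptable result")

# Three independently written "versions" of a square routine; one faulty.
v1 = lambda x: x * x
v2 = lambda x: x ** 2
v3 = lambda x: x * x + 1   # injected fault
```

Because the versions run concurrently, the block's latency approaches that of the slowest single version rather than the sum of all versions, which is the performance gain the abstract attributes to the multi-core design.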
  • Khosro Rezaee, S. Jalal Mousavirad, Mohammad Rasegh Ghezelbash, Javad Haddania Page 67
    Smart and timely detection of fire can be very useful in coping with and inhibiting this phenomenon. This paper addresses fire detection by enhancing several image analysis methods: converting RGB images to HSV, smart threshold selection for fire separation, Gaussian mixture modeling, and forming a polygon around the enclosed area produced by edge detection, combined with the original image. Accuracy, precision and rapid detection are among the features that distinguish the proposed system from similar fire detection systems such as those based on Markov models, GM, DBFIR and other published algorithms. The average accuracy (95%) obtained from testing 35,000 frames in different fire environments and the high sensitivity (96%) are quite significant. Not only can this system be regarded as a reliable alternative to the sensor sets used in residential areas, but its high-speed image processing and accurate detection of fire over wide areas also make it low-cost, reliable and appropriate.
    Keywords: Fire detection, Gaussian Mixture Model, image processing, HSV Space, edge detection
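The first stage described above, RGB-to-HSV conversion followed by thresholding, can be sketched as a per-pixel test: flag a pixel as fire-like when its hue lies in the red-yellow band with high saturation and brightness. The threshold values below are illustrative assumptions, not the paper's tuned ("smart") thresholds, which it selects adaptively.

```python
# Sketch of the HSV conversion + thresholding stage of fire detection.
# Threshold values are illustrative, not the paper's adaptive choices.

import colorsys

def is_fire_pixel(r, g, b):
    # colorsys works on [0, 1]; hue is also in [0, 1] (multiply by 360 for degrees).
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h <= 60 / 360.0 and s >= 0.4 and v >= 0.6

def fire_mask(frame):
    # frame: rows of (r, g, b) tuples -> rows of booleans (candidate mask).
    return [[is_fire_pixel(*px) for px in row] for row in frame]
```

In the full pipeline this binary mask would feed the Gaussian mixture model and the edge-detection/polygon stage that refine candidate fire regions.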