A Method to Identify and Counter Robot-Generated Fraud that Inflates Websites' Traffic Ranking
With the expansion of the Internet and the Web, communication and information gathering between individuals has shifted from its traditional forms to websites. The World Wide Web also offers businesses a great opportunity to improve their relationships with clients and to expand their marketplace online. Businesses use a criterion called traffic ranking to gauge a site's popularity and visibility. Traffic ranking measures the number of visitors to a site and, based on these statistics, assigns the site a rank. One of the most important challenges in ranking is fake traffic generated by applications called robots. Robots are malicious software components used for generating spam, mounting distributed denial-of-service attacks, phishing, identity theft, deletion of information, and other illegal activities. Several methods already exist for identifying and detecting robots. According to Doran et al., identification methods fall into two categories: offline and real-time. Offline detection methods divide into three subcategories: syntactical log analysis, traffic pattern analysis, and analytical learning techniques. Real-time methods rely on Turing test systems. In this research, robots are identified through the offline approach, by analyzing and processing web server access logs and applying data mining techniques. First, the features of each session are extracted; then the sessions are labeled, under three conditions, into two categories, human and robot. Finally, web robots are detected using a data mining tool. In all previous studies, features were extracted per session: in early work, Tan & Kumar extracted 25 session features; Bomhardt et al. later used 34 features to identify robots; in 2009, Stassopoulou et al. used 6 features extracted from sessions; and so on.
In this research, by contrast, features are extracted from all sessions of a unique user. Experimental results show that the proposed method, by discovering new features and introducing a new condition in session labeling, improves the accuracy of robot identification over previous work and, in turn, improves the reliability of web traffic ranking.
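The offline pipeline described above (sessionize access-log requests, extract per-session features, then label each session human or robot) can be sketched as follows. This is a minimal illustration, not the paper's actual method: the log records, the sessionization key (IP plus user agent), the three features, and the heuristic labeling rule are all assumptions made for the example; the research itself uses a trained data mining classifier and a richer feature set.

```python
from collections import defaultdict

# Hypothetical access-log records: (ip, user_agent, path, status).
# Illustrative data only; real input would be parsed web server logs.
LOG = [
    ("10.0.0.1", "Mozilla/5.0",    "/index.html", 200),
    ("10.0.0.1", "Mozilla/5.0",    "/style.css",  200),
    ("10.0.0.2", "ExampleBot/1.0", "/robots.txt", 200),
    ("10.0.0.2", "ExampleBot/1.0", "/page1",      404),
    ("10.0.0.2", "ExampleBot/1.0", "/page2",      404),
]

def sessionize(log):
    """Group requests into sessions keyed by (ip, user_agent)."""
    sessions = defaultdict(list)
    for ip, ua, path, status in log:
        sessions[(ip, ua)].append((path, status))
    return sessions

def features(requests):
    """Extract a few session features of the kind used in robot detection."""
    n = len(requests)
    return {
        "requests": n,
        "error_rate": sum(1 for _, s in requests if s >= 400) / n,
        "robots_txt": any(p == "/robots.txt" for p, _ in requests),
    }

def label(f):
    """Toy heuristic; the actual work trains a data mining classifier."""
    return "robot" if f["robots_txt"] or f["error_rate"] > 0.5 else "human"

if __name__ == "__main__":
    for (ip, ua), reqs in sessionize(LOG).items():
        print(ip, label(features(reqs)))
```

A real implementation would replace the heuristic `label` with a classifier trained on labeled sessions, and would aggregate features across all sessions of the same user, as the abstract proposes.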
-
Estimate the Location of a Breast Tumor using the Received Signal Angle
Javad Nouri Pour, MohammadAli Pourmina *, MohamadNaser Moghaddasi, Behbod Ghalamkari
Majlesi Journal of Telecommunication Devices, Mar 2023 -
Improving the Mean Time to Failure of the System with the New Architecture of the Main Node with the Replacement Node of Industrial Wireless Sensor Networks for Monitoring and Control using Markov Model
Ahmadreza Zamani, MohammadAli Pourmina, Ramin Shaghaghi Kandovan
Majlesi Journal of Telecommunication Devices, Sep 2022