Detection of Obstacle Regions Around an MAV using an Expansion-based Technique

Author(s):
Article Type:
Research/Original Article (accredited journal)
Abstract:
Introduction

Micro-Aerial Vehicles (MAVs) are ideal platforms for indoor and outdoor applications because of their small size and light weight [1, 2]. Obstacles, however, can cause MAVs to crash. Cameras capture rich information about the surroundings, and vision-based techniques detect obstacles using grayscale values [3], point features [4], and edge details [5]. These techniques are divided into monocular and stereo types. Monocular approaches fall into four categories: appearance-based [6], motion-based [7], depth-based [8], and expansion-based [4]. Expansion-based techniques rest on the same principle as human vision: objects appear to expand in the image as the camera approaches them. Most expansion-based systems identify obstacles by tracking object points [4, 5, 9-11]. Relying solely on points, however, may not be sufficient, and the MAV may collide with undetected obstacles. To overcome this limitation, we describe a novel technique based on the same principle but employing region expansion rates.
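To make the expansion cue concrete, the following is a minimal Python/OpenCV sketch of point-based expansion estimation, the idea the proposed method generalizes from points to regions. It assumes two 8-bit grayscale frames captured while the camera moves toward the scene; the Shi-Tomasi/Lucas-Kanade tracking choices, the threshold value, and the function name are illustrative assumptions, not the paper's implementation.

```python
import cv2
import numpy as np

def point_expansion(prev_gray, next_gray, expand_thresh=1.2):
    """Estimate how much tracked scene points expand between two
    frames; a ratio well above 1 suggests an approaching obstacle.
    Sketch only: a real system would group points per object."""
    # Detect Shi-Tomasi corners in the first frame.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7)
    if pts is None or len(pts) < 3:
        return None  # a convex hull needs at least three points
    # Track the corners into the second frame with Lucas-Kanade flow.
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, pts, None)
    ok = status.ravel() == 1
    p0 = pts[ok].reshape(-1, 2).astype(np.float32)
    p1 = nxt[ok].reshape(-1, 2).astype(np.float32)
    if len(p0) < 3:
        return None  # too few matches: the failure mode noted in the results
    # Compare the areas of the convex hulls of the matched point sets.
    a0 = cv2.contourArea(cv2.convexHull(p0))
    a1 = cv2.contourArea(cv2.convexHull(p1))
    if a0 == 0:
        return None
    ratio = a1 / a0
    return ratio, ratio > expand_thresh  # (expansion rate, looming?)
```

Note that the hull area is undefined with fewer than three matched points, which is exactly the per-region failure mode discussed in the results below.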

Methodology

The proposed obstacle detection technique comprises three steps: (a) data acquisition and preparation, (b) region extraction and computation of region areas, and (c) obstacle detection; a sketch of how these steps could fit together follows.
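The abstract does not give the full pipeline, so the following is a hedged Python/OpenCV sketch of steps (a)-(c): regions are extracted in two consecutive frames, matched by nearest centroid (a simplification assumed here, not the paper's matching scheme), and flagged as obstacles when their area grows beyond a threshold. Otsu thresholding, the area and distance limits, and the function names are all illustrative; fisheye undistortion in step (a) is omitted.

```python
import cv2
import numpy as np

def extract_regions(gray, min_area=500.0):
    """(b) Region extraction and area computation: Otsu threshold +
    external contours stand in for the paper's segmentation."""
    _, bw = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(bw, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x API
    regions = []
    for c in contours:
        area = cv2.contourArea(c)
        if area >= min_area:
            m = cv2.moments(c)
            regions.append({"centroid": (m["m10"] / m["m00"],
                                         m["m01"] / m["m00"]),
                            "area": area})
    return regions

def detect_obstacles(gray1, gray2, expand_thresh=1.2, max_dist=40.0):
    """(c) Flag region pairs whose area ratio exceeds expand_thresh."""
    r1, r2 = extract_regions(gray1), extract_regions(gray2)
    obstacles = []
    for a in r1:
        # Naive nearest-centroid match in the second frame.
        best = min(r2, default=None,
                   key=lambda b: np.hypot(b["centroid"][0] - a["centroid"][0],
                                          b["centroid"][1] - a["centroid"][1]))
        if best is None:
            continue
        dist = np.hypot(best["centroid"][0] - a["centroid"][0],
                        best["centroid"][1] - a["centroid"][1])
        if dist <= max_dist and best["area"] / a["area"] > expand_thresh:
            obstacles.append((a, best))  # region seen growing = close obstacle
    return obstacles
```

Nearest-centroid matching is the weakest link of this sketch; any correspondence scheme robust to the fisheye distortion noted in the results would slot in at that point.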

Results and discussion

We captured four pairs of images with an LG 360 CAM fisheye camera: two with the camera moving forward and two with the camera moving sideways. In the forward direction, recall is 82% for the first data set and 52% for the second. In these cases the technique detects only a portion of the obstacle region, because some regions lack at least three corresponding points. While moving to the right, recall for the third and fourth data sets is 69% and 39%, respectively. Accuracy on the fourth data set is lower than on the others for the reason just given, the absence of at least three corresponding points, and because of possibly inaccurate correspondences, particularly along the edges of the fisheye image, where image quality is low.
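For reference, the recall figures above follow the standard definition, TP / (TP + FN). Below is a minimal computation over hypothetical pixel-wise ground-truth and detection masks; the authors' exact evaluation protocol is not given in the abstract.

```python
import numpy as np

def recall(detected_mask, truth_mask):
    """Fraction of true obstacle pixels the detector recovered."""
    detected = np.asarray(detected_mask, dtype=bool)
    truth = np.asarray(truth_mask, dtype=bool)
    tp = np.count_nonzero(detected & truth)   # true positives
    fn = np.count_nonzero(~detected & truth)  # false negatives
    return tp / (tp + fn) if (tp + fn) else 0.0
```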

Conclusion

The proposed method extracts the regions of close obstacles from outdoor fisheye images, and the findings demonstrate its efficacy in a variety of complex environments: on average, 60% of the obstacles are detected across the forward-movement and right-movement modes. A comparison with the method of Al-Kaff et al. (2017) [4] also shows that the proposed approach is more efficient than that existing algorithm. The proposed algorithm nevertheless has some limitations. First, certain regions lack at least three corresponding points, and incorrect correspondences cause incorrect detection of obstacles. Second, an obstacle with the same color as the background leads to errors in detection. Third, the proposed approach requires a long processing time. These constraints can be addressed in the future with more accurate and faster algorithms.

Language:
Persian
Published:
Journal of Geomatics Science and Technology, Volume 11, Issue 3, 2022
Pages:
63 to 81
magiran.com/p2411095  