Please use this identifier to cite or link to this item:

Title: Real-Time Vision-Based River Detection and Lateral Shot Following for Autonomous UAVs
Authors: 劉吉軒
Jyi-Shane Liu
Huang, Yenting
Lee, Gongyi
Soong, Rutai
Contributors: Department of Computer Science (資科系)
Keywords: Rivers;Image segmentation;Unmanned aerial vehicles;Navigation;Inspection;Task analysis;Image edge detection
Date: 2020-09
Issue Date: 2021-06-04 14:50:15 (UTC+8)
Abstract: Most existing autonomous UAV inspection tasks focus on the surrounding environment and facilities. The UAV typically navigates above the inspected target and conducts the inspection with its camera aimed downward at the target. However, in some scenarios it is risky for a UAV to fly directly above the inspected target: when patrolling a river, for example, the UAV risks falling into the water. Similar risks exist for scenarios such as railways and power lines. This research proposes a lateral shot following approach in which the UAV follows the river laterally while collecting image data with a front-view camera. The proposed approach has been evaluated on different river segments in real-world environments. The experiments include two following methods and two viewpoints to suit different task needs. Results show that our deep neural network extracts river masks in real time with high accuracy. With adaptive steering adjustments, the UAV achieves accurate and robust following when handling geographical changes across river segments. A performance comparison between human operators and our autonomous system shows that the autonomous system achieves better following accuracy and consistency.
Relation: Proceedings of the 2020 IEEE International Conference on Real-time Computing and Robotics, IEEE
Data Type: conference
DOI Link:
Appears in Collections: [Department of Computer Science (資訊科學系)] Conference Papers

Files in This Item:

File: 294.pdf
Size: 1379 KB
Format: Adobe PDF

All items in the Academic Hub (學術集成) are protected by copyright, with all rights reserved.
