Title Autonomous UAV Surround Inspection based on Visual Navigation
Author Chang, Wei-Chao
Contributors Liu, Jyi-Shane (advisor); Chang, Wei-Chao
Keywords UAV; SLAM; Behavior tree; Building inspection
Date 2020
Upload time 1-Feb-2021 14:10:48 (UTC+8)
Abstract Unlike aerial imaging over open fields, aerial inspection of man-made structures requires more complex navigation from a drone. The drone must move toward the target object in a controlled manner to acquire close-up views of the structure's surface while, at the same time, avoiding collision with the target for its own safety. In this thesis, we present an autonomous inspection task for man-made structures based on visual navigation. We use SLAM as the basis for visual positioning, take orbital inspection of cylindrical, pole-like structures as our main case, and test the method in actual flight.
There are two main aspects to our technical contribution. First, we present our research as a relatively complete mission: starting from take-off, the drone autonomously identifies the target structure, plans an orbital inspection path, performs real-time correction while orbiting, and returns home after completing the user-defined number of orbits. Second, we use a behavior tree as the control architecture to integrate all functional components, enhancing overall stability and feasibility, and we develop the system on a low-cost micro drone. Real-world experiments show that the drone can perform the orbital inspection task with a certain success rate and can acquire complete imagery of the target structure for structural inspection.
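The mission flow described in the abstract (take off, identify the target, orbit with real-time correction for a user-defined number of rounds, then return home) maps naturally onto a behavior-tree control architecture. The sketch below is illustrative only: the node names, status values, and tick logic are hypothetical and are not taken from the thesis implementation.

```python
# Minimal behavior-tree sketch of the mission flow in the abstract.
# All names and the tick logic are illustrative, not the thesis code.

SUCCESS, FAILURE, RUNNING = "SUCCESS", "FAILURE", "RUNNING"

class Action:
    """Leaf node wrapping a callable that returns a status string."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn

    def tick(self):
        return self.fn()

class Sequence:
    """Sequence with memory: on the next tick it resumes from the child
    that last returned RUNNING instead of restarting from the front."""
    def __init__(self, children):
        self.children = children
        self.current = 0

    def tick(self):
        while self.current < len(self.children):
            status = self.children[self.current].tick()
            if status == RUNNING:
                return RUNNING          # stay on this child next tick
            if status == FAILURE:
                self.current = 0        # reset so the mission can retry
                return FAILURE
            self.current += 1           # child succeeded, move on
        self.current = 0
        return SUCCESS

def make_mission(rounds, log):
    """Build the take-off -> identify -> orbit -> return-home sequence."""
    state = {"orbits_done": 0}

    def take_off():
        log.append("take_off")
        return SUCCESS

    def identify_target():
        log.append("identify_target")
        return SUCCESS

    def orbit_with_correction():
        # One tick per orbit round; RUNNING until the user-defined
        # number of rounds has been completed.
        state["orbits_done"] += 1
        log.append(f"orbit_{state['orbits_done']}")
        return SUCCESS if state["orbits_done"] >= rounds else RUNNING

    def return_home():
        log.append("return_home")
        return SUCCESS

    return Sequence([
        Action("TakeOff", take_off),
        Action("IdentifyTarget", identify_target),
        Action("OrbitWithCorrection", orbit_with_correction),
        Action("ReturnHome", return_home),
    ])

log = []
tree = make_mission(rounds=2, log=log)
while tree.tick() == RUNNING:   # re-tick until the mission finishes
    pass
print(log)  # ['take_off', 'identify_target', 'orbit_1', 'orbit_2', 'return_home']
```

The sequence-with-memory variant is chosen here because a plain sequence would re-run take-off on every tick while the orbit node is still RUNNING; a real controller would also add fallback branches for the error-recovery behavior the thesis outlines.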
References
[1] International Civil Aviation Organization (ICAO). Unmanned Aircraft Systems (UAS); ICAO: Montreal, QC, Canada, 2011.
[2] C. Stöcker, A. Eltner, and P. Karrasch, "Measuring gullies by synergetic application of UAV and close range photogrammetry—A case study from Andalusia, Spain," Catena, vol. 132, pp. 1-11, 2015.
[3] C. Yuan, Y. M. Zhang and Z. X. Liu, "A survey on technologies for automatic forest fire monitoring, detection, and fighting using unmanned aerial vehicles and remote sensing techniques," Canadian Journal of Forest Research, published online 12 Mar. 2015.
[4] H. Aasen, E. Honkavaara, A. Lucieer, and P. Zarco-Tejada, “Quantitative remote sensing at ultra-high resolution with UAV spectroscopy: A review of sensor technology, measurement procedures, and data correction workflows,” Remote Sens., vol. 10, no. 7, p. 1091, 2018.
[5] M. Israel, "A UAV-based roe deer fawn detection system", Proc. Int. Conf. Unmanned Aerial Veh. Geomatics (UAV-g), vol. 38, pp. 1-5, 2011.
[6] M. N. Gillins, D. T. Gillins and C. Parrish, "Cost-effective bridge safety inspections using unmanned aircraft systems (UAS)", Geotechnical and Structural Engineering Congress, 2016.
[7] M. Asim, D. N. Ehsan, and K. Rafique, "Probable causal factors in UAV accidents based on human factor analysis and classification system," in Proc. 27th Int. Congr. Aeronaut. Sci., vol. 1905, p. 5, 2005.
[8] N. Hallermann and G. Morgenthal, "Visual inspection strategies for large bridges using unmanned aerial vehicles (UAV)," Proc. of 7th IABMAS International Conference on Bridge Maintenance, Safety and Management, pp. 661-667, 2014.
[9] S. Omari, P. Gohl, M. Burri, M. Achtelik and R. Siegwart, "Visual industrial inspection using aerial robots", Proceedings of CARPI, 2014.
[10] Y. Song, S. Nuske and S. Scherer, "A multi-sensor fusion MAV state estimation from long-range stereo IMU GPS and barometric sensors", Sensors, vol. 17, no. 1, 2017.
[11] S. Ullman, "The interpretation of structure from motion", Proc. R. Soc. London, vol. B203, pp. 405-426, 1979.
[12] J. Engel, V. Koltun and D. Cremers, "Direct sparse odometry", IEEE Trans. Pattern Anal. Mach. Intell., vol. 40, no. 3, pp. 611-625, Mar. 2018.
[13] J. Engel, T. Schöps and D. Cremers, "LSD-SLAM: Large-scale direct monocular SLAM", Proc. Eur. Conf. Comput. Vision, pp. 834-849, Sep. 2014.
[14] A. Buyval, I. Afanasyev and E. Magid, "Comparative analysis of ROS-based monocular SLAM methods for indoor navigation," International Conference on Machine Vision (ICMV 2016), vol. 10341, p. 103411K, 2017.
[15] R. Mur-Artal, J. M. M. Montiel and J. D. Tardós, "ORB-SLAM: A versatile and accurate monocular SLAM system", IEEE Trans. Robot., vol. 31, no. 5, pp. 1147-1163, Oct. 2015.
[16] M. Filipenko and I. Afanasyev, "Comparison of various SLAM systems for mobile robot in an indoor environment," International Conference on Intelligent Systems, Sep. 2018.
[17] V. De Araujo, A. P. G. S. Almeida, C. T. Miranda, and F. De Barros Vidal, "A parallel hierarchical finite state machine approach to UAV control for search and rescue tasks," in Proceedings of the 11th International Conference on Informatics in Control, Automation and Robotics (ICINCO '14), pp. 410-415, Sep. 2014.
[18] M. Colledanchise and P. Ögren, "How behavior trees modularize hybrid control systems and generalize sequential behavior compositions the subsumption architecture and decision trees", IEEE Trans. Robot., vol. 33, no. 2, pp. 372-389, Apr. 2017.
[19] M. Samkuma, Y. Kobayashi, T. Emaru and A. Ravankar, "Mapping of Pier Substructure Using UAV", IEEE/SICE International Symposium on System Integration, 2016.
[20] P. Shanthakumar, K. Yu, M. Singh, J. Orevillo, E. Bianchi, M. Hebdon, et al., "View planning and navigation algorithms for autonomous bridge inspection with UAVs," International Symposium on Experimental Robotics, pp. 201-210, 2018.
[21] A. Al-Kaff, F. M. Moreno, L. J. San José, F. García, D. Martín, A. De La Escalera, et al., "VBII-UAV: Vision-based infrastructure inspection-UAV," World Conference on Information Systems and Technologies WorldCist '17, pp. 221-231, 2017.
[22] F. Kendoul, "Survey of advances in guidance, navigation, and control of unmanned rotorcraft systems," Journal of Field Robotics, vol. 29, no. 2, pp. 315-378, Mar. 2012.
[23] I. Sa, S. Hrabar and P. Corke, "Outdoor flight testing of a pole inspection UAV incorporating high-speed vision", Springer Tracts Adv. Robot., vol. 105, pp. 107-121, Dec. 2015.
[24] S. A. K. Tareen and Z. Saleem, “A comparative analysis of sift, surf, kaze, akaze, orb, and brisk,” in 2018 International Conference on Computing, Mathematics and Engineering Technologies (iCoMET), pp. 1–10, March 2018
[25] M. A. Fischler and R. C. Bolles, "Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography," Commun. ACM, vol. 24, pp. 381-395, June 1981.
[26] G. Shi, X. Xu, and Y. Dai, ‘‘SIFT feature point matching based on improved RANSAC algorithm,’’ in Proc. 5th Int. Conf. Intell. Hum.- Mach. Syst. Cybern., vol. 1, pp. 474–477, Aug. 2013.
[27] H. Strasdat, J. M. M. Montiel and A. J. Davison, "Scale drift-aware large scale monocular SLAM", Proc. Robot.: Sci. Syst., Jun. 2010.
[28] S. Choi, P. Jaehyun and Y. Wonpil, "Resolving scale ambiguity for monocular Visual Odometry", IEEE International Conference on Ubiquitous Robots and Ambient Intelligence, pp. 604-608, 2013.
[29] J. Heinly, E. Dunn and J.-M. Frahm, "Comparative evaluation of binary features", European Conf. Comput. Vision, pp. 759-773, 2012.
[30] A. Sujiwo et al., "Robust and accurate monocular vision-based localization in outdoor environments of real-world robot challenge", J. Robot. Mechatronics, vol. 29, no. 4, pp. 685-696, 2017.
[31] Parrot Drones SAS (n.d.). Retrieved October 4, 2020, from https://support.parrot.com/global/support/products
[32] Bebop_autonomy. (n.d.). Retrieved October 4, 2020, from https://bebop-autonomy.readthedocs.io/en/latest
[33] I. Abdel-Qader, O. Abudayyeh, and M. E. Kelly, "Analysis of edge-detection techniques for crack identification in bridges," J. Comput. Civil Eng., vol. 17, no. 4, pp. 255-263, Oct. 2003.
Description Master's thesis
National Chengchi University
Department of Computer Science
107753035
Source http://thesis.lib.nccu.edu.tw/record/#G0107753035
Type thesis
dc.contributor.advisor Liu, Jyi-Shane
dc.contributor.author (Author) Chang, Wei-Chao
dc.creator (Author) Chang, Wei-Chao
dc.date (Date) 2020
dc.date.accessioned 1-Feb-2021 14:10:48 (UTC+8)
dc.date.available 1-Feb-2021 14:10:48 (UTC+8)
dc.date.issued (Upload time) 1-Feb-2021 14:10:48 (UTC+8)
dc.identifier (Other Identifiers) G0107753035
dc.identifier.uri (URI) http://nccur.lib.nccu.edu.tw/handle/140.119/133895
dc.description (Description) Master's thesis
dc.description (Description) National Chengchi University
dc.description (Description) Department of Computer Science
dc.description (Description) 107753035
dc.description.tableofcontents Chapter 1 Introduction
1.1 Research Background
1.2 Research Motivation and Objectives
1.3 Thesis Organization
1.4 Research Results and Contributions
Chapter 2 Literature Review
2.1 Simultaneous Localization and Mapping (SLAM)
2.2 Behavior Trees
2.3 Case Studies of Building Inspection
Chapter 3 Vision-Based Autonomous UAV Surround Inspection
3.1 Feature Point Matching
3.2 ORB-SLAM
3.3 Orbit Path Generation
3.4 Path Navigation
3.5 Error Recovery
3.6 Behavior Tree
Chapter 4 Experimental Design and Result Analysis
4.1 Experimental Design
4.2 Design of Evaluation Metrics
4.3 Experimental Results and Analysis
4.3.1 Deviation of the UAV from the Planned Path
4.3.2 Directional Deviation of the Orbit Path
4.4 Top View of the Autonomous UAV Orbit Inspection Path
4.5 Discussion of Experimental Results
Chapter 5 Conclusion and Future Work
5.1 Conclusions
5.2 Future Work
References
dc.format.extent 1869058 bytes
dc.format.mimetype application/pdf
dc.source.uri (Source) http://thesis.lib.nccu.edu.tw/record/#G0107753035
dc.subject (Keywords) UAV
dc.subject (Keywords) SLAM
dc.subject (Keywords) Behavior tree
dc.subject (Keywords) Building inspection
dc.title (Title) Autonomous UAV Surround Inspection based on Visual Navigation
dc.type (Type) thesis
dc.identifier.doi (DOI) 10.6814/NCCU202100038