學術產出 (Academic Output) – Theses

題名 (Title) 基於視覺導航之無人機自主降落韌性提升
Robustness Enhancement on Visually Guided UAV Autonomous Landing
作者 (Author) 蔣明憲
Chiang, Min-Hsien
貢獻者 (Contributors) 劉吉軒 (advisor)
Liu, Jyi-Shane (advisor)
蔣明憲
Chiang, Min-Hsien
關鍵詞 (Keywords) 無人機
四軸無人機
自主精準降落
降落韌性
降落方法
基於視覺的引導系統
電腦視覺
決策控制
降落標記設計
UAV
Quadcopter
Autonomous Precision Landing
Landing Robustness
Landing Strategy
Vision-based Guidance Systems
Computer Vision
Decision Control
Landing Marker Design
日期 (Date) 2023
上傳時間 (Upload time) 1-Feb-2024 11:40:49 (UTC+8)
摘要 (Abstract) 近年來，由於軟硬體架構的革新和大環境的變化的影響，飛行無人機已成為研究的焦點。它具備高機動性和可滯空兩種特性，不論是用於軍事用途，如無人化遠距偵查和執行特定軍事任務，或者是商業上的應用，如空中巡檢和影像獲取，都受到廣泛關注。過去三年疫情的影響，零接觸概念開始備受重視，無人機的發展也逐漸成為焦點之一。自主降落作為飛行中的最後環節，在無人機智慧化中扮演著關鍵的角色，這項技術在過去十年中受到廣泛研究。特別是當無人機降落於各種環境時，視情況需要整合視覺追蹤、軌跡預測、路徑規劃以及動力算法等技術，以完成降落任務。考慮到在現實場景中可能遇到的多變情況，降落韌性提昇是一項值得深入研究的重要課題。

鑒於過去多數論文的實驗停留在相對理想的環境下進行，如虛擬及室內環境，而與現實場景存在一定落差，因此本研究旨在提升無人機在現實中的降落韌性，以此設計出能在高空、不同風場以及碰到目標被遮蔽的情況下也能穩定追蹤目標並降落的方法。

為達成上述韌性目標，我們設計出一套基於視覺的自主降落系統，涵蓋從降落標記設計、降落流程設計、飛行邏輯設計、降落方法設計以及硬體底層的飛行控制，並在視覺處理上整合多種演算法來互相驗證並以提升導航韌性，除此之外，透過將視覺反饋與路徑規劃進行整合，成功設計出一個極具韌性的新型降落方法。
In recent years, owing to innovations in hardware and software architectures and changes in the broader environment, unmanned aerial vehicles (UAVs) have become a focal point of research. They possess two key characteristics, high maneuverability and the ability to loiter, which make them of significant interest for a wide range of applications: military uses such as unmanned long-range reconnaissance and the execution of specific missions, as well as commercial applications such as aerial inspection and image acquisition. The COVID-19 pandemic of the past three years has also placed a strong emphasis on contactless operations, further driving the development of UAVs. Autonomous landing, as the final phase of a flight, plays a crucial role in making UAVs intelligent and has been extensively researched over the past decade. Depending on the landing environment, it may require integrating visual tracking, trajectory prediction, path planning, and control algorithms to accomplish the landing task. Given the many variable conditions that UAVs may encounter in real-world scenarios, enhancing landing resilience is an important topic that merits in-depth research.

Because most previous studies have carried out their experiments in relatively ideal settings, such as virtual or indoor environments, which differ from real-world scenarios, this study aims to enhance the landing resilience of UAVs under real-world conditions. The goal is to design a method that enables a UAV to stably track a target and land on it even in challenging situations, including high altitudes, varying wind conditions, and partial occlusion of the target.

To achieve these resilience goals, we designed a vision-based autonomous landing system that spans landing marker design, landing procedure design, flight logic design, landing method design, and low-level hardware flight control. In visual processing, we integrate multiple algorithms that cross-validate one another to improve navigation resilience. Furthermore, by integrating visual feedback with path planning, we developed a highly resilient new landing method.
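The abstract does not give implementation details, but the idea of cross-validating several independent visual estimates before acting on them can be illustrated with a minimal sketch. The Python fragment below is purely hypothetical and is not taken from the thesis: the detector interface, the disagreement threshold, and the control gains are invented for illustration. It accepts a landing-marker fix only when two independent detectors agree, and maps the validated pixel offset to a conservative descent command.

import math
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class Detection:
    u: float            # marker centre, image x (pixels)
    v: float            # marker centre, image y (pixels)
    confidence: float   # detector-specific score in [0, 1]


def fuse_detections(a: Optional[Detection], b: Optional[Detection],
                    max_disagreement_px: float = 25.0) -> Optional[Detection]:
    """Accept a marker fix only when the two detectors agree, or one is clearly reliable."""
    if a and b:
        if math.hypot(a.u - b.u, a.v - b.v) <= max_disagreement_px:
            w = a.confidence + b.confidence
            return Detection((a.u * a.confidence + b.u * b.confidence) / w,
                             (a.v * a.confidence + b.v * b.confidence) / w,
                             min(1.0, w / 2.0))
        return None          # detectors disagree: treat this frame as unreliable
    single = a or b
    return single if (single and single.confidence > 0.8) else None


def velocity_command(det: Optional[Detection],
                     image_size: Tuple[int, int] = (640, 480),
                     k_xy: float = 0.002,
                     descent_rate: float = 0.4) -> Tuple[float, float, float]:
    """Map a validated pixel offset to a body-frame velocity setpoint (m/s)."""
    if det is None:
        return (0.0, 0.0, 0.0)        # no trusted fix: hold position and re-acquire, do not descend
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    vx = -k_xy * (det.v - cy)         # forward/backward from vertical pixel error (down-facing camera assumed)
    vy = -k_xy * (det.u - cx)         # left/right from horizontal pixel error
    return (vx, vy, -descent_rate)    # descend only while the marker is validated

For example, fuse_detections(Detection(300, 250, 0.9), Detection(310, 255, 0.7)) returns a confidence-weighted centre, whereas two widely disagreeing detections return None and velocity_command then commands a hover instead of a descent.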
參考文獻 (References)
[1] Abdulla Al-Kaff, David Martín, Fernando García, Arturo de la Escalera, and José María Armingol. Survey of computer vision algorithms and applications for unmanned aerial vehicles. Expert Systems with Applications, 92:447–463, 2018.
[2] Muhammad Yeasir Arafat, Muhammad Morshed Alam, and Sangman Moh. Vision-based navigation techniques for unmanned aerial vehicles: Review and challenges. Drones, 7(2), 2023.
[3] Tomas Baca, Petr Stepan, and Martin Saska. Autonomous landing on a moving car with unmanned aerial vehicle. In 2017 European Conference on Mobile Robots (ECMR), pages 1–6, 2017.
[4] Alexandre Borowczyk, Duc-Tien Nguyen, André Phu-Van Nguyen, Dang Quang Nguyen, David Saussié, and Jerome Le Ny. Autonomous landing of a multirotor micro air vehicle on a high velocity ground vehicle. IFAC-PapersOnLine, 50(1):10488–10494, 2017. 20th IFAC World Congress.
[5] Michele Colledanchise and Petter Ögren. Behavior trees in robotics and AI: An introduction. CoRR, abs/1709.00084, 2017.
[6] Michele Colledanchise and Petter Ögren. How behavior trees modularize robustness and safety in hybrid systems. In 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, pages 1482–1488, 2014.
[7] Beizhen Feng, Xiaofei Yang, Ronghao Wang, Xin Yan, Hongwei She, and Liang Shan. Design and implementation of autonomous takeoff and landing UAV system for USV platform. In 2022 International Conference on Cyber-Physical Social Intelligence (ICCSI), pages 292–296, 2022.
[8] M. Fiala. ARTag, a fiducial marker system using digital techniques. In 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05), volume 2, pages 590–596, 2005.
[9] Jawhar Ghommam and Maarouf Saad. Autonomous landing of a quadrotor on a moving platform. IEEE Transactions on Aerospace and Electronic Systems, 53(3):1504–1519, 2017.
[10] Adrián González-Sieira, Daniel Cores, Manuel Mucientes, and Alberto Bugarín. Autonomous navigation for UAVs managing motion and sensing uncertainty. Robotics and Autonomous Systems, 126:103455, 2020.
[11] Elder M. Hemerly. Automatic georeferencing of images acquired by UAVs. International Journal of Automation and Computing, 2014.
[12] Youeyun Jung, Dongjin Lee, and Hyochoong Bang. Close-range vision navigation and guidance for rotary UAV autonomous landing. In 2015 IEEE International Conference on Automation Science and Engineering (CASE), pages 342–347, 2015.
[13] Azarakhsh Keipour, Guilherme A. S. Pereira, Rogerio Bonatti, Rohit Garg, Puru Rastogi, Geetesh Dubey, and Sebastian Scherer. Visual servoing approach to autonomous UAV landing on a moving vehicle. Sensors, 22(17), 2022.
[14] Sarantis Kyristsis, Angelos Antonopoulos, Theofilos Chanialakis, Emmanouel Stefanakis, Christos Linardos, Achilles Tripolitsiotis, and Panagiotis Partsinevelos. Towards autonomous modular UAV missions: The detection, geo-location and landing paradigm. Sensors, 16(11), 2016.
[15] Sven Lange, Niko Sunderhauf, and Peter Protzel. A vision based onboard approach for landing and position control of an autonomous multirotor UAV in GPS-denied environments. In 2009 International Conference on Advanced Robotics, pages 1–6, 2009.
[16] Min-Fan Ricky Lee, Shun-Feng Su, Jie-Wei Eric Yeah, Husan-Ming Huang, and Jonathan Chen. Autonomous landing system for aerial mobile robot cooperation. In 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems (SCIS) and 15th International Symposium on Advanced Intelligent Systems (ISIS), pages 1306–1311, 2014.
[17] Zhou Li, Yang Chen, Hao Lu, Huaiyu Wu, and Lei Cheng. UAV autonomous landing technology based on AprilTags vision positioning algorithm. Pages 8148–8153, July 2019.
[18] Shanggang Lin, Lianwen Jin, and Ziwei Chen. Real-time monocular vision system for UAV autonomous landing in outdoor low-illumination environments. Sensors, 21(18), 2021.
[19] Rong Liu, Jianjun Yi, Yajun Zhang, Bo Zhou, Wenlong Zheng, Hailei Wu, Shuqing Cao, and Jinzhen Mu. Vision-guided autonomous landing of multirotor UAV on fixed landing marker. In 2020 IEEE International Conference on Artificial Intelligence and Computer Applications (ICAICA), pages 455–458, 2020.
[20] Edwin Olson. AprilTag: A robust and flexible visual fiducial system. In 2011 IEEE International Conference on Robotics and Automation, pages 3400–3407, 2011.
[21] Umberto Papa and Giuseppe Del Core. Design of sonar sensor model for safe landing of an UAV. In 2015 IEEE Metrology for Aerospace (MetroAeroSpace), pages 346–350, 2015.
[22] Aleix Paris, Brett T. Lopez, and Jonathan P. How. Dynamic landing of an autonomous quadrotor on a moving platform in turbulent wind conditions. In 2020 IEEE International Conference on Robotics and Automation (ICRA), pages 9577–9583, 2020.
[23] Tatiana Pavlenko, Martin Schütz, Martin Vossiek, Thomas Walter, and Sergio Montenegro. Wireless local positioning system for controlled UAV landing in GNSS-denied environment. In 2019 IEEE 5th International Workshop on Metrology for AeroSpace (MetroAeroSpace), pages 171–175, 2019.
[24] Riccardo Polvara, Sanjay Sharma, Jian Wan, Andrew Manning, and Robert Sutton. Vision-based autonomous landing of a quadrotor on the perturbed deck of an unmanned surface vehicle. Drones, 2(2), 2018.
[25] Hamid Yusuf Putranto, Astria Nur Irfansyah, and Muhammad Attamimi. Identification of safe landing areas with semantic segmentation and contour detection for delivery UAV. In 2022 9th International Conference on Information Technology, Computer, and Electrical Engineering (ICITACEE), pages 254–257, 2022.
[26] Pengrui Qiu, Xiping Yuan, Shu Gan, and Yu Lin. Research on image denoising adaptive algorithm for UAV based on visual landing. In 2017 International Conference on Computer Network, Electronic and Automation (ICCNEA), pages 408–411, 2017.
[27] Marcos Felipe Santos Rabelo, Alexandre Santos Brandão, and Mário Sarcinelli-Filho. Landing a UAV on static or moving platforms using a formation controller. IEEE Systems Journal, 15(1):37–45, 2021.
[28] René Ranftl, Alexey Bochkovskiy, and Vladlen Koltun. Vision transformers for dense prediction, 2021.
[29] Liu Ruifeng, Wang Jiasheng, Zhang Haolong, and Tian Mengfan. Research progress and application of behavior tree technology. In 2019 6th International Conference on Behavioral, Economic and Socio-Cultural Computing (BESC), pages 1–4, 2019.
[30] David Safadinho, João Ramos, Roberto Ribeiro, Vítor Filipe, João Barroso, and António Pereira. UAV landing using computer vision techniques for human detection. Sensors, 20(3), 2020.
[31] Artur Sagitov, Ksenia Shabalina, Roman Lavrenov, and Evgeni Magid. Comparing fiducial marker systems in the presence of occlusion. In 2017 International Conference on Mechanical, System and Control Engineering (ICMSC), pages 377–382, 2017.
[32] C. S. Sharp, O. Shakernia, and S. S. Sastry. A vision system for landing an unmanned aerial vehicle. In Proceedings 2001 ICRA. IEEE International Conference on Robotics and Automation, volume 2, pages 1720–1727, 2001.
[33] Taemin Shim and Hyochoong Bang. Autonomous landing of UAV using vision based approach and PID controller based outer loop. In 2018 18th International Conference on Control, Automation and Systems (ICCAS), pages 876–879, 2018.
[34] Tuan Do Trong, Quan Tran Hai, Manh Vu Van, Binh Nguyen Thai, Tung Nguyen Chi, and Truong Nguyen Quang. Autonomous detection and approach tracking of moving ship on the sea by VTOL UAV based on deep learning technique through simulated real-time on-air image acquisitions. In 2021 8th NAFOSTED Conference on Information and Computer Science (NICS), pages 374–380, 2021.
[35] T. K. Venugopalan, Tawfiq Taher, and George Barbastathis. Autonomous landing of an unmanned aerial vehicle on an autonomous marine vehicle. In 2012 Oceans, pages 1–9, 2012.
[36] Holger Voos and Haitham Bou-Ammar. Nonlinear tracking and landing controller for quadrotor aerial robots. In 2010 IEEE International Conference on Control Applications, pages 2136–2141, 2010.
[37] Zhiqing Wei, Mingyue Zhu, Ning Zhang, Lin Wang, Yingying Zou, Zeyang Meng, Huici Wu, and Zhiyong Feng. UAV-assisted data collection for Internet of Things: A survey. IEEE Internet of Things Journal, 9(17):15460–15483, 2022.
[38] Long Xin, Zimu Tang, Weiqi Gai, and Haobo Liu. Vision-based autonomous landing for the UAV: A review. Aerospace, 9(11), 2022.
[39] Nguyen Xuan-Mung, Sung Kyung Hong, Ngoc Phi Nguyen, Le Nhu Ngoc Thanh Ha, and Tien-Loc Le. Autonomous quadcopter precision landing onto a heaving platform: New method and experiment. IEEE Access, 8:167192–167202, 2020.
描述 (Description) 碩士 (Master's thesis)
國立政治大學 (National Chengchi University)
資訊科學系 (Department of Computer Science)
110753208
資料來源 (Source) http://thesis.lib.nccu.edu.tw/record/#G0110753208
資料類型 (Type) thesis
URI https://nccur.lib.nccu.edu.tw/handle/140.119/149647
檔案 (File) application/pdf, 55248318 bytes
目次 (Table of Contents)
Chapter 1 Introduction 1
  1.1 Research Background 1
  1.2 Research Motivation 3
  1.3 Research Objectives 3
Chapter 2 Literature Review 4
  2.1 Resilience in Autonomous Landing 4
  2.2 Analysis of Resilience Techniques 6
  2.3 Complex Landing Scenarios 8
  2.4 Summary 10
Chapter 3 Technical Architecture and Functional Modules 12
  3.1 Autonomous Landing System and Landing Marker 12
    3.1.1 System Architecture 12
    3.1.2 Composite Landing Marker 14
    3.1.3 Geofence 17
  3.2 UAV Visual Processing Module 20
    3.2.1 Visual Processing Architecture 20
    3.2.2 Visual Navigation 24
    3.2.3 Platform Horizontal Distance Estimation 26
  3.3 Decision Module 27
    3.3.1 Behavior-Tree-Based Decision Module 27
    3.3.2 Landing Method 29
  3.4 Other Mechanisms 31
    3.4.1 Stabilization Mechanism 31
    3.4.2 Landing Path Planning 34
    3.4.3 Visual Path Inference 35
  3.5 Summary 36
Chapter 4 Experiments and Results Analysis 37
  4.1 Experimental Equipment and Site 37
    4.1.1 Software Architecture and Development Environment 37
    4.1.2 Experimental Equipment 37
    4.1.3 Site Layout 39
    4.1.4 Communication Architecture 39
    4.1.5 Experimental Field 40
  4.2 Experimental Design 41
    4.2.1 Evaluation Metrics 42
    4.2.2 Experimental Methods 43
  4.3 Experimental Results and Analysis 46
    4.3.1 Phase 1 Experimental Results 47
    4.3.2 Phase 1 Results Analysis 48
    4.3.3 Phase 2 Experimental Results and Analysis 52
    4.3.4 Anomaly Analysis 55
    4.3.5 Landing in Force 3 Wind 59
    4.3.6 Landing in Complex Environments 60
    4.3.7 Research Limitations 62
Chapter 5 Conclusions and Future Work 64
  5.1 Research Conclusions 64
  5.2 Future Work 66
References 68
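The table of contents indicates that the decision module is built on behavior trees (Section 3.3.1; see also references [5], [6], and [29]). As a purely illustrative aside, and not the thesis implementation, a minimal behavior-tree skeleton of the kind such a module could be built on is sketched below in Python; all node names, leaf behaviours, and the fallback policy are invented for illustration, and the RUNNING state used by full behavior-tree frameworks is omitted for brevity.

from typing import Callable, List

SUCCESS, FAILURE = "SUCCESS", "FAILURE"


def sequence(children: List[Callable[[], str]]) -> Callable[[], str]:
    """Succeeds only if every child succeeds, evaluated left to right."""
    def tick() -> str:
        for child in children:
            if child() == FAILURE:
                return FAILURE
        return SUCCESS
    return tick


def selector(children: List[Callable[[], str]]) -> Callable[[], str]:
    """Returns success at the first child that succeeds; fails only if all children fail."""
    def tick() -> str:
        for child in children:
            if child() == SUCCESS:
                return SUCCESS
        return FAILURE
    return tick


# Hypothetical leaf behaviours (stubs standing in for perception/control calls).
marker_visible = lambda: SUCCESS          # e.g., a fused marker detection is available this frame
descend_over_marker = lambda: SUCCESS     # e.g., send a velocity setpoint toward the marker
climb_and_search = lambda: SUCCESS        # e.g., regain altitude and re-acquire the marker

# Root: prefer tracking-and-descending; otherwise fall back to searching.
landing_root = selector([
    sequence([marker_visible, descend_over_marker]),
    climb_and_search,
])

if __name__ == "__main__":
    print(landing_root())                 # one tick of the tree -> "SUCCESS"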