Title: 無人機前方視野精準輪廓線條跟隨方法之研究 (Accurate contour line-following methods for UAV forward view)
Author: 李恭儀 (Lee, Gong-Yi)
Advisor: 劉吉軒 (Liu, Jyi-Shane)
Keywords: UAV (無人機); autonomous flight control (自主飛行控制); contour line following (輪廓線條跟隨); two-dimensional direction vector probability model (二維方向向量機率模型)
Date: 2018
Upload time: 23-Jan-2019 14:55:20 (UTC+8)
Abstract: Most existing line-following techniques have a drone follow lines in its downward view, flying forward at constant altitude and speed and adjusting its steering angle to change the following direction. However, when the drone follows lines in its forward view, adjusting the steering angle makes the line disappear from that view. In addition, existing line followers assume a previously followed line will not be detected again, yet in the forward view the followed line remains visible, so the forward view must be held fixed when determining the following direction. When the drone changes direction, the inertia produced by constant-speed motion also causes it to deviate from the intended flight path. Existing studies have further shortcomings, such as erroneous line detection, the lack of a standard way to evaluate line-following performance, and no requirement on following accuracy.
This study therefore proposes a two-dimensional direction vector probability model that resolves the directionality problem arising when the drone follows a line in its forward view and prevents erroneous line detections from disrupting the following behavior. A two-layer two-dimensional direction vector probability model, combined with an inertial speed suppression method, suppresses the motion inertia generated when the drone changes direction, allowing it to follow lines quickly and accurately. The study also proposes two evaluation metrics for precise line following: 1) the degree to which the drone's vision center stays within the width of the target path, and 2) the displacement error of the drone's vision center when it drifts beyond the target path width.
Finally, real-world experiments verify the feasibility, stability, and accuracy of the proposed methods. Using the vector-field-based line follower, the most accurate method in prior work, as the benchmark, the proposed methods outperform it on both evaluation metrics. The two-layer two-dimensional direction vector probability model combined with the inertial speed suppression method performs best: it selects the flight direction, corrects the drone's position, suppresses inertial speed, and can follow complex lines. Having passed these real-world tests, the proposed methods are practically applicable. Future work can further improve and extend forward-view line following, including stability during movement, reduction of displacement error, and outdoor applications such as high-voltage transmission tower inspection and safety inspection of skyscraper facilities.
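The abstract does not detail the internals of the direction vector probability model, so the following Python sketch is only a hedged illustration of the general idea, not the thesis's algorithm: it assumes a small set of discretized 2D direction vectors in the forward-view image, scores each candidate direction by how many detected contour points fall into its angular bin relative to the vision center, and normalizes the scores into a probability distribution from which the flight direction is chosen. The function name, the binning scheme, and the scoring rule are all assumptions made for this example.

    import math

    def direction_probabilities(contour_points, num_directions=8):
        # Hypothetical sketch only: score discretized 2D directions by how many
        # detected contour points (offsets (dx, dy) from the vision center) fall
        # into each angular bin, then normalize into a probability distribution.
        scores = [0.0] * num_directions
        for dx, dy in contour_points:
            angle = math.atan2(dy, dx) % (2 * math.pi)
            scores[int(angle / (2 * math.pi) * num_directions) % num_directions] += 1.0
        total = sum(scores)
        if total == 0:
            return [1.0 / num_directions] * num_directions
        return [s / total for s in scores]

    # Example: contour points mostly to the right of the vision center favor bin 0.
    probs = direction_probabilities([(10, 1), (12, -2), (9, 0), (-3, 8)])
    best_direction = max(range(len(probs)), key=lambda i: probs[i])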
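The two evaluation metrics are described concretely enough to illustrate with a minimal sketch. The code below is not from the thesis; it assumes the drone logs, for each video frame, the lateral offset (in pixels) between its vision center and the target path centerline, and that the path has a known half-width in the same units. The function names and the per-frame offset samples are hypothetical.

    def in_path_ratio(offsets, half_width):
        # Metric 1 (assumed form): fraction of frames in which the vision center
        # lies within the target path width.
        if not offsets:
            return 0.0
        return sum(1 for d in offsets if abs(d) <= half_width) / len(offsets)

    def mean_displacement_error(offsets, half_width):
        # Metric 2 (assumed form): mean displacement of the vision center beyond
        # the target path width, over the frames where it drifts outside.
        errors = [abs(d) - half_width for d in offsets if abs(d) > half_width]
        return sum(errors) / len(errors) if errors else 0.0

    # Example with synthetic per-frame offsets (pixels) and an 8-pixel half-width:
    samples = [2.0, -5.5, 12.0, 7.0, -1.0]
    print(in_path_ratio(samples, half_width=8.0))            # 0.8
    print(mean_displacement_error(samples, half_width=8.0))  # 4.0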
References:
[1] Khan, M. I., Salam, M. A., Afsar, M. R., Huda, M. N., & Mahmud, T. (2016, July). Design, fabrication & performance analysis of an unmanned aerial vehicle. In AIP Conference Proceedings (Vol. 1754, No. 1, p. 060007). AIP Publishing.
[2] Intel Corporation. (n.d.). Retrieved January 9, 2018, from https://click.intel.com/intel-aero-ready-to-fly-drone.html
[3] DJI M200. (n.d.). Retrieved November 23, 2018, from https://www.dji.com/zh-tw/matrice-200-series?site=brandsite&from=nav
[4] Parrot Drones SAS. (n.d.). Retrieved October 2, 2018, from https://www.parrot.com/us/drones/parrot-bebop-2.
[5] Punetha, D., Kumar, N., & Mehta, V. (2013). Development and Applications of Line Following Robot Based Health Care Management System. International Journal of Advanced Research in Computer Engineering & Technology (IJARCET), 2(8), 2446-2450.
[6] Dupuis, J. F., & Parizeau, M. (2006, June). Evolving a vision-based line-following robot controller. In Computer and Robot Vision, 2006. The 3rd Canadian Conference on (pp. 75-75). IEEE.
[7] Nelson, D. R., Barber, D. B., McLain, T. W., & Beard, R. W. (2007). Vector field path following for miniature air vehicles. IEEE Transactions on Robotics, 23(3), 519-529.
[8] Sujit, P. B., Saripalli, S., & Sousa, J. B. (2013, July). An evaluation of UAV path following algorithms. In Control Conference (ECC), 2013 European (pp. 3332-3337). IEEE.
[9] Brandao, A. S., Martins, F. N., & Soneguetti, H. B. (2015, July). A vision-based line following strategy for an autonomous UAV. In Informatics in Control, Automation and Robotics (ICINCO), 2015 12th International Conference on (Vol. 2, pp. 314-319). IEEE.
[10] Martinez, C., Sampedro, C., Chauhan, A., & Campoy, P. (2014, May). Towards autonomous detection and tracking of electric towers for aerial power line inspection. In Unmanned Aircraft Systems (ICUAS), 2014 International Conference on (pp. 284-295). IEEE.
[11] Choi, S. S., & Kim, E. K. (2015, July). Building crack inspection using small UAV. In Advanced Communication Technology (ICACT), 2015 17th International Conference on (pp. 235-238). IEEE.
[12] Nguyen, V. N., Jenssen, R., & Roverso, D. (2018). Automatic autonomous vision-based power line inspection: A review of current status and the potential role of deep learning. International Journal of Electrical Power & Energy Systems, 99, 107-120.
[13] Ghani, N. M. A., Naim, F., & Yon, T. P. (2011). Two wheels balancing robot with line following capability. World Academy of Science, Engineering and Technology, 55, 634-638.
[14] Páll, E., Mathe, K., Tamas, L., & Busoniu, L. (2014, May). Railway track following with the AR.Drone using vanishing point detection. In Automation, Quality and Testing, Robotics, 2014 IEEE International Conference on (pp. 1-6). IEEE.
[15] Cerón, A., Mondragón, I., & Prieto, F. (2018). Onboard visual-based navigation system for power line following with UAV. International Journal of Advanced Robotic Systems, 15(2), 1729881418763452.
[16] Hartley, R., Kamgar-Parsi, B., & Narber, C. (2018). Using Roads for Autonomous Air Vehicle Guidance. IEEE Transactions on Intelligent Transportation Systems.
[17] Shen, W., Wang, X., Wang, Y., Bai, X., & Zhang, Z. (2015). DeepContour: A deep convolutional feature learned by positive-sharing loss for contour detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 3982-3991).
[18] Bertasius, G., Shi, J., & Torresani, L. (2015). DeepEdge: A multi-scale bifurcated deep network for top-down contour detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 4380-4389).
[19] Arbelaez, P., Maire, M., Fowlkes, C., & Malik, J. (2011). Contour detection and hierarchical image segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33(5), 898-916.
[20] Barton, M. J. (2001). Controller development and implementation for path planning and following in an autonomous urban vehicle. Undergraduate thesis, University of Sydney.
[21] Park, S., Deyst, J., & How, J. P. (2007). Performance and Lyapunov stability of a nonlinear path following guidance method. Journal of Guidance, Control, and Dynamics, 30(6), 1718-1728.
[22] Kothari, M., Postlethwaite, I., & Gu, D. W. (2010). A Suboptimal Path Planning Algorithm Using Rapidly-exploring Random Trees. International Journal of Aerospace Innovations, 2.
[23] Ratnoo, A., Sujit, P. B., & Kothari, M. (2011, September). Adaptive optimal path following for high wind flights. In 18th International Federation of Automatic Control (IFAC) World Congress (pp. 12-985).
[24] Su, J. H., Lee, C. S., Huang, H. H., Chuang, S. H., & Lin, C. Y. (2010). An intelligent line-following robot project for introductory robot courses. World Transactions on Engineering and Technology Education, 8(4), 455-461.
[25] Khalife, J., Shamaei, K., Bhattacharya, S., & Kassas, Z. (2018, September). Centimeter-accurate UAV navigation with cellular signals. In Proceedings of ION GNSS Conference.
[26] Kothari, M., Postlethwaite, I., & Gu, D. W. (2014). UAV path following in windy urban environments. Journal of Intelligent & Robotic Systems, 74(3-4), 1013-1028.
[27] Sipser, M. (2006). Introduction to the Theory of Computation (Vol. 2). Boston: Thomson Course Technology.
[28] Alsalam, B. H. Y., Morton, K., Campbell, D., & Gonzalez, F. (2017, March). Autonomous UAV with vision based on-board decision making for remote sensing and precision agriculture. In Aerospace Conference, 2017 IEEE (pp. 1-12). IEEE.
[29] Puterman, M. L. (2014). Markov decision processes: Discrete stochastic dynamic programming. John Wiley & Sons.
[30] Koenig, J., Malberg, S., Martens, M., Niehaus, S., Krohn-Grimberghe, A., & Ramaswamy, A. (2018). Multi-Stage Reinforcement Learning for Object Detection. arXiv preprint arXiv:1810.10325.
[31] Xiang, Y., Alahi, A., & Savarese, S. (2015). Learning to track: Online multi-object tracking by decision making. In Proceedings of the IEEE International Conference on Computer Vision (pp. 4705-4713).
[32] Turchetta, M., Berkenkamp, F., & Krause, A. (2016). Safe exploration in finite Markov decision processes with Gaussian processes. In Advances in Neural Information Processing Systems (pp. 4312-4320).
[33] Ferreira, L. A., Bianchi, R. A., Santos, P. E., & de Mantaras, R. L. (2018, June). A method for the online construction of the set of states of a Markov Decision Process using Answer Set Programming. In International Conference on Industrial, Engineering and Other Applications of Applied Intelligent Systems (pp. 3-15). Springer, Cham.
[34] Suprayitno, H., Ratnasari, V., & Saraswati, N. (2017, November). Experiment Design for Determining the Minimum Sample Size for Developing Sample Based Trip Length Distribution. In IOP Conference Series: Materials Science and Engineering (Vol. 267, No. 1, p. 012029). IOP Publishing.
[35] Parrot Bebop 2 Power technical specifications. (n.d.). Retrieved November 13, 2018, from https://www.parrot.com/us/drones/parrot-bebop-2-power-pack-fpv#technicals.
[36] bebop_autonomy. (n.d.). Retrieved November 13, 2018, from https://bebop-autonomy.readthedocs.io/en/latest.
Description: Master's thesis; National Chengchi University (國立政治大學); Department of Computer Science (資訊科學系); 105753006
Source: http://thesis.lib.nccu.edu.tw/record/#G0105753006
Type: thesis
URI: http://nccur.lib.nccu.edu.tw/handle/140.119/122133
Table of contents:
Chapter 1 Introduction
1.1 Research Background
1.2 Research Motivation and Objectives
1.3 Thesis Organization
1.4 Research Results and Contributions
Chapter 2 Literature Review
2.1 Line Following
2.2 State Transition
2.3 Summary
Chapter 3 Direction Decision Model for Forward-View Contour Line Following
3.1 Contour Line Detection
3.2 Vector-Field-Based Line Following Method
3.3 Single-Layer Two-Dimensional Direction Vector Probability Model
3.4 Two-Layer Two-Dimensional Direction Vector Probability Model
3.5 Inertial Speed Suppression Method
Chapter 4 Experimental Design and Result Analysis
4.1 Experimental Design
4.1.1 Computation of the Evaluation Metrics
4.2 Experimental Data
4.2.1 Average Time-Point Results of the Experimental Data and Complexity Analysis of the Test Patterns
4.2.2 Task Completion Time Results of the Experimental Data and Complexity Analysis of the Test Patterns
4.3 Experimental Results and Analysis
4.3.1 Results for the Degree to Which the UAV Vision Center Stays Within the Target Path Width
4.3.2 Results for the Displacement Error of the UAV Vision Center Beyond the Target Path Width
4.3.3 Results for Corner Flight Performance
4.4 Discussion of Turning Angles
4.5 Summary
Chapter 5 Conclusions and Future Work
5.1 Conclusions
5.2 Future Work
References
Appendix
Format: application/pdf, 3055443 bytes
DOI: 10.6814/THE.NCCU.CS.001.2019.B02