題名 (Title) 以深度學習建構蔬果辨識之知識模型
Constructing knowledge model of vegetable and fruit recognition with deep learning
作者 (Author) 賴暘晟
Lai, Yang-Cheng
貢獻者 (Contributors) 羅崇銘
Lo, Chung-Ming
賴暘晟
Lai, Yang-Cheng
關鍵詞 (Keywords) Smart agriculture
Deep learning
Image recognition
Fruit and vegetable images
日期 (Date) 2021
上傳時間 (Date uploaded) 2-Sep-2021 16:36:41 (UTC+8)
摘要 (Abstract) The demand for food will grow with the world's total population, and crop production must keep pace with people's needs. Labor shortages, however, leave crops unplanted or unharvested, reducing output. Smart agriculture applies information and communication technology to meet global production needs, help farmers compete on quality, and substitute for manual labor; image recognition in particular is used to judge the ripeness and variety of fruits and vegetables. Earlier image-recognition approaches required hand-crafted features, which demanded expertise and generalized poorly. This study instead applies deep learning to fruit and vegetable recognition, building a knowledge model by learning features from a large number of images. Digital images of many kinds of fruits and vegetables were collected and used to train several convolutional neural network architectures, both with transfer learning and by training from scratch; the results are compared in terms of accuracy, efficiency, and number of parameters. On the knowledge model distinguishing 15 kinds of fruits and vegetables, SqueezeNet with transfer learning reached 99.68% accuracy and DenseNet201 trained from scratch reached 99.25%, but their training times differed by a factor of about 14. On the knowledge models for different varieties of a single fruit or vegetable and for fruits and vegetables of similar color, DenseNet201 trained from scratch outperformed SqueezeNet with transfer learning. In addition, models trained on images converted to grayscale still achieved over 98% accuracy.
The current knowledge models can only recognize scenes from the collected image datasets; to apply a model to a different scene, it must be retrained with image data from that scene to reach a sufficiently high accuracy.
參考文獻 (References) [1] S. Sharma, R. Shandilya, U. S. Tim, and J. Wong, "eFeed-Hungers.com: Mitigating global hunger crisis using next generation technologies," Telematics and Informatics, vol. 35, no. 2, pp. 446-456, 2018/05/01/ 2018, doi: https://doi.org/10.1016/j.tele.2018.01.003.
[2] "Selected Results of the 2019 UN World Population Projections," Population and Development Review, vol. 45, no. 3, pp. 689-694, 2019, doi: https://doi.org/10.1111/padr.12288.
[3] J. A. Foley et al., "Solutions for a cultivated planet," Nature, vol. 478, no. 7369, pp. 337-342, 2011/10/01 2011, doi: 10.1038/nature10452.
[4] J. Busch and J. Engelmann, "Cost-effectiveness of reducing emissions from tropical deforestation, 2016–2050," Environmental Research Letters, vol. 13, no. 1, p. 015001, 2017/12/01 2017, doi: 10.1088/1748-9326/aa907c.
[5] P. Friedlingstein et al., "Update on CO2 emissions," Nature Geoscience, vol. 3, no. 12, pp. 811-812, 2010/12/01 2010, doi: 10.1038/ngeo1022.
[6] R. DeFries and C. Rosenzweig, "Toward a whole-landscape approach for sustainable land use in the tropics," Proceedings of the National Academy of Sciences, vol. 107, no. 46, p. 19627, 2010, doi: 10.1073/pnas.1011163107.
[7] J. Poesen, "Soil erosion in the Anthropocene: Research needs," Earth Surface Processes and Landforms, vol. 43, no. 1, pp. 64-84, 2018, doi: 10.1002/esp.4250.
[8] M. Burke and K. Emerick, "Adaptation to Climate Change: Evidence from US Agriculture," (in English), Am. Econ. J.-Econ. Policy, Article vol. 8, no. 3, pp. 106-140, Aug 2016, doi: 10.1257/pol.20130025.
[9] T. Vanwalleghem et al., "Impact of historical land use and soil management change on soil erosion and agricultural sustainability during the Anthropocene," Anthropocene, vol. 17, pp. 13-29, 2017/03/01/ 2017, doi: https://doi.org/10.1016/j.ancene.2017.01.002.
[10] W. Ouyang, Y. Wu, Z. Hao, Q. Zhang, Q. Bu, and X. Gao, "Combined impacts of land use and soil property changes on soil erosion in a mollisol area under long-term agricultural development," Science of The Total Environment, vol. 613-614, pp. 798-809, 2018/02/01/ 2018, doi: https://doi.org/10.1016/j.scitotenv.2017.09.173.
[11] J. P. Vasconez, G. A. Kantor, and F. A. A. Cheein, "Human-robot interaction in agriculture: A survey and current challenges," (in English), Biosyst. Eng., Review vol. 179, pp. 35-48, Mar 2019, doi: 10.1016/j.biosystemseng.2018.12.005.
[12] J. B. Samtani et al., "The Status and Future of the Strawberry Industry in the United States," (in English), HortTechnology hortte, vol. 29, no. 1, p. 11, 01 Feb. 2019 2019, doi: 10.21273/horttech04135-18.
[13] J. Guthman, "Paradoxes of the Border: Labor Shortages and Farmworker Minor Agency in Reworking California's Strawberry Fields," (in English), Econ. Geogr., Article vol. 93, no. 1, pp. 24-43, Jan 2017, doi: 10.1080/00130095.2016.1180241.
[14] Bureau of Labor Statistics. (2020). "Farming, Fishing, and Forestry Occupations." Occupational Employment Statistics. Available: https://www.bls.gov/oes/current/oes450000.htm#
[15] X. Zheng, "Drawing upon the Experience and Lessons of the Netherlands and Japan to Accelerate Agricultural Modernization," in China’s 40 Years of Economic Reform and Development: How the Miracle Was Created. Singapore: Springer Singapore, 2018, pp. 315-319.
[16] J.-H. Chuang, J.-H. Wang, and Y.-C. Liou, "Farmers’ Knowledge, Attitude, and Adoption of Smart Agriculture Technology in Taiwan," Int. J. Environ. Res. Public Health, vol. 17, no. 19, p. 7236, 2020. [Online]. Available: https://www.mdpi.com/1660-4601/17/19/7236.
[17] C. A. Damalas and I. G. Eleftherohorinos, "Pesticide Exposure, Safety Issues, and Risk Assessment Indicators," (in English), Int. J. Environ. Res. Public Health, Review vol. 8, no. 5, pp. 1402-1419, May 2011, doi: 10.3390/ijerph8051402.
[18] L. P. Rocha, M. R. Cezar-Vaz, M. C. V. de Almeida, D. R. Piexak, and C. A. Bonow, "Association between pain and agricultural workload," (in English), Acta Paul. Enferm., Article vol. 27, no. 4, pp. 333-339, 2014, doi: 10.1590/1982-0194201400056.
[19] M. S. Farooq, S. Riaz, A. Abid, K. Abid, and M. A. Naeem, "A Survey on the Role of IoT in Agriculture for the Implementation of Smart Farming," IEEE Access, vol. 7, pp. 156237-156271, 2019, doi: 10.1109/ACCESS.2019.2949703.
[20] Ö. Köksal and B. Tekinerdogan, "Architecture design approach for IoT-based farm management information systems," Precision Agriculture, vol. 20, no. 5, pp. 926-958, 2019/10/01 2019, doi: 10.1007/s11119-018-09624-8.
[21] A. Triantafyllou, D. C. Tsouros, P. Sarigiannidis, and S. Bibi, "An Architecture model for Smart Farming," in 2019 15th International Conference on Distributed Computing in Sensor Systems (DCOSS), 29-31 May 2019 2019, pp. 385-392, doi: 10.1109/DCOSS.2019.00081.
[22] F. Pallottino et al., "Machine Vision Retrofit System for Mechanical Weed Control in Precision Agriculture Applications," (in English), Sustainability, Article vol. 10, no. 7, p. 9, Jul 2018, Art no. 2209, doi: 10.3390/su10072209.
[23] S. C. Li, L. D. Xu, and S. S. Zhao, "The internet of things: a survey," (in English), Inf. Syst. Front., Article vol. 17, no. 2, pp. 243-259, Apr 2015, doi: 10.1007/s10796-014-9492-7.
[24] K. Chang, P. Liu, Z. Kuo, and S. Liao, "Design of persimmon growing stage monitoring system using image recognition technique," in 2016 IEEE International Conference on Consumer Electronics-Taiwan (ICCE-TW), 27-29 May 2016 2016, pp. 1-2, doi: 10.1109/ICCE-TW.2016.7520978.
[25] T. Lewis, "Evolution of farm management information systems," (in English), Comput. Electron. Agric., Article vol. 19, no. 3, pp. 233-248, Mar 1998, doi: 10.1016/s0168-1699(97)00040-9.
[26] G. N. Lu, M. Batty, J. Strobl, H. Lin, A. X. Zhu, and M. Chen, "Reflections and speculations on the progress in Geographic Information Systems (GIS): a geographic perspective," (in English), INTERNATIONAL JOURNAL OF GEOGRAPHICAL INFORMATION SCIENCE, vol. 33, no. 2, pp. 346-367, FEB 1 2019, doi: 10.1080/13658816.2018.1533136.
[27] R. H. Sprague, "A Framework for the Development of Decision Support Systems," MIS Quarterly, vol. 4, no. 4, pp. 1-26, 1980, doi: 10.2307/248957.
[28] F. Y. Narvaez, G. Reina, M. Torres-Torriti, G. Kantor, and F. A. Cheein, "A Survey of Ranging and Imaging Techniques for Precision Agriculture Phenotyping," IEEE/ASME Transactions on Mechatronics, vol. 22, no. 6, pp. 2428-2439, 2017, doi: 10.1109/TMECH.2017.2760866.
[29] B. A. Aubert, A. Schroeder, and J. Grimaudo, "IT as enabler of sustainable farming: An empirical analysis of farmers' adoption decision of precision agriculture technology," Decision Support Systems, vol. 54, no. 1, pp. 510-520, 2012/12/01/ 2012, doi: https://doi.org/10.1016/j.dss.2012.07.002.
[30] J. Clapp and S. L. Ruder, "Precision Technologies for Agriculture: Digital Farming, Gene-Edited Crops, and the Politics of Sustainability," (in English), Glob. Environ. Polit., Article vol. 20, no. 3, pp. 49-69, Aug 2020, doi: 10.1162/glep_a_00566.
[31] A. King, "Technology: The Future of Agriculture," Nature, vol. 544, no. 7651, pp. S21-S23, 2017/04/01 2017, doi: 10.1038/544S21a.
[32] V. Subramanian, T. F. Burks, and A. A. Arroyo, "Development of machine vision and laser radar based autonomous vehicle guidance systems for citrus grove navigation," Comput. Electron. Agric., vol. 53, no. 2, pp. 130-143, 2006/09/01/ 2006, doi: https://doi.org/10.1016/j.compag.2006.06.001.
[33] H. Mousazadeh, "A technical review on navigation systems of agricultural autonomous off-road vehicles," Journal of Terramechanics, vol. 50, no. 3, pp. 211-232, 2013/06/01/ 2013, doi: https://doi.org/10.1016/j.jterra.2013.03.004.
[34] V. Marinoudi, C. G. Sorensen, S. Pearson, and D. Bochtis, "Robotics and labour in agriculture. A context consideration," (in English), Biosyst. Eng., Article vol. 184, pp. 111-121, Aug 2019, doi: 10.1016/j.biosystemseng.2019.06.013.
[35] P. S. S, K. Malarvizhi, S. Karthik, and M. G. S.G, "Machine Learning and Internet of Things based Smart Agriculture," in 2020 6th International Conference on Advanced Computing and Communication Systems (ICACCS), 6-7 March 2020 2020, pp. 1101-1106, doi: 10.1109/ICACCS48705.2020.9074472.
[36] D. Wang, C. Li, H. Song, H. Xiong, C. Liu, and D. He, "Deep Learning Approach for Apple Edge Detection to Remotely Monitor Apple Growth in Orchards," IEEE Access, vol. 8, pp. 26911-26925, 2020, doi: 10.1109/ACCESS.2020.2971524.
[37] J. D. Pujari, R. Yakkundimath, and A. S. Byadgi, "Identification and classification of fungal disease affected on agriculture/horticulture crops using image processing techniques," in 2014 IEEE International Conference on Computational Intelligence and Computing Research, 18-20 Dec. 2014 2014, pp. 1-4, doi: 10.1109/ICCIC.2014.7238283.
[38] M. S. Hossain, M. Al-Hammadi, and G. Muhammad, "Automatic Fruit Classification Using Deep Learning for Industrial Applications," IEEE Transactions on Industrial Informatics, vol. 15, no. 2, pp. 1027-1034, 2019, doi: 10.1109/TII.2018.2875149.
[39] M. Recce, J. Taylor, A. Plebe, and G. Tropiano, "Vision and neural control for an orange harvesting robot," in Proceedings of International Workshop on Neural Networks for Identification, Control, Robotics and Signal/Image Processing, 21-23 Aug. 1996 1996, pp. 467-475, doi: 10.1109/NICRSP.1996.542791.
[40] R. Ceres, J. L. Pons, A. R. Jiménez, J. M. Martín, and L. Calderón, "Design and implementation of an aided fruit‐harvesting robot (Agribot)," Industrial Robot: An International Journal, vol. 25, no. 5, pp. 337-346, 1998, doi: 10.1108/01439919810232440.
[41] K. Tanigaki, T. Fujiura, A. Akase, and J. Imagawa, "Cherry-harvesting robot," Comput. Electron. Agric., vol. 63, no. 1, pp. 65-72, 2008/08/01/ 2008, doi: https://doi.org/10.1016/j.compag.2008.01.018.
[42] N. Irie, N. Taguchi, T. Horie, and T. Ishimatsu, "Asparagus harvesting robot coordinated with 3-D vision sensor," in 2009 IEEE International Conference on Industrial Technology, 10-13 Feb. 2009 2009, pp. 1-6, doi: 10.1109/ICIT.2009.4939556.
[43] S. Hayashi et al., "Field Operation of a Movable Strawberry-harvesting Robot using a Travel Platform," Japan Agricultural Research Quarterly: JARQ, vol. 48, no. 3, pp. 307-316, 2014, doi: 10.6090/jarq.48.307.
[44] B. Arad et al., "Development of a sweet pepper harvesting robot," Journal of Field Robotics, vol. 37, no. 6, pp. 1027-1039, 2020, doi: 10.1002/rob.21937.
[45] A. Bhargava and A. Bansal, "Fruits and vegetables quality evaluation using computer vision: A review," Journal of King Saud University - Computer and Information Sciences, 2018/06/05/ 2018, doi: https://doi.org/10.1016/j.jksuci.2018.06.002.
[46] F. Raponi, R. Moscetti, D. Monarca, A. Colantoni, and R. Massantini, "Monitoring and Optimization of the Process of Drying Fruits and Vegetables Using Computer Vision: A Review," (in English), Sustainability, Review vol. 9, no. 11, p. 27, Nov 2017, Art no. 2009, doi: 10.3390/su9112009.
[47] G. Romano, M. Nagle, and J. Müller, "Two-parameter Lorentzian distribution for monitoring physical parameters of golden colored fruits during drying by application of laser light in the Vis/NIR spectrum," Innovative Food Science & Emerging Technologies, vol. 33, pp. 498-505, 2016/02/01/ 2016, doi: https://doi.org/10.1016/j.ifset.2015.11.007.
[48] J. L. Rojas-Aranda, J. I. Nunez-Varela, J. C. Cuevas-Tello, and G. Rangel-Ramirez, "Fruit Classification for Retail Stores Using Deep Learning," in Pattern Recognition, Cham, K. M. Figueroa Mora, J. Anzurez Marín, J. Cerda, J. A. Carrasco-Ochoa, J. F. Martínez-Trinidad, and J. A. Olvera-López, Eds., 2020// 2020: Springer International Publishing, pp. 3-13.
[49] Y. Osako, H. Yamane, S.-Y. Lin, P.-A. Chen, and R. Tao, "Cultivar discrimination of litchi fruit images using deep learning," Scientia Horticulturae, vol. 269, p. 109360, 2020/07/27/ 2020, doi: https://doi.org/10.1016/j.scienta.2020.109360.
[50] J.-R. Xiao, P.-C. Chung, H.-Y. Wu, Q.-H. Phan, J.-L. A. Yeh, and M. T. Hou, "Detection of Strawberry Diseases Using a Convolutional Neural Network," Plants, vol. 10, no. 1, 2021, doi: 10.3390/plants10010031.
[51] S. Dargan, M. Kumar, M. R. Ayyagari, and G. Kumar, "A Survey of Deep Learning and Its Applications: A New Paradigm to Machine Learning," Archives of Computational Methods in Engineering, vol. 27, no. 4, pp. 1071-1092, 2020/09/01 2020, doi: 10.1007/s11831-019-09344-w.
[52] S. K. Behera, A. K. Rath, and P. K. Sethy, "Maturity status classification of papaya fruits based on machine learning and transfer learning approach," Information Processing in Agriculture, 2020/05/20/ 2020, doi: https://doi.org/10.1016/j.inpa.2020.05.003.
[53] M. Jogin, Mohana, M. S. Madhulika, G. D. Divya, R. K. Meghana, and S. Apoorva, "Feature Extraction using Convolution Neural Networks (CNN) and Deep Learning," in 2018 3rd IEEE International Conference on Recent Trends in Electronics, Information & Communication Technology (RTEICT), 18-19 May 2018 2018, pp. 2319-2323, doi: 10.1109/RTEICT42901.2018.9012507.
[54] R. Kline, "Cybernetics, automata studies, and the Dartmouth conference on artificial intelligence," IEEE Annals of the History of Computing, vol. 33, no. 4, pp. 5-16, 2010.
[55] U. Gasser and V. A. F. Almeida, "A Layered Model for AI Governance," IEEE Internet Computing, vol. 21, no. 6, pp. 58-62, 2017, doi: 10.1109/MIC.2017.4180835.
[56] N. Kriegeskorte and T. Golan, "Neural network models and deep learning," Current Biology, vol. 29, no. 7, pp. R231-R236, 2019/04/01/ 2019, doi: https://doi.org/10.1016/j.cub.2019.02.034.
[57] S. Sengupta et al., "A review of deep learning with special emphasis on architectures, applications and recent trends," (in English), Knowledge-Based Syst., Review vol. 194, p. 33, Apr 2020, Art no. 105596, doi: 10.1016/j.knosys.2020.105596.
[58] Y. LeCun, Y. Bengio, and G. Hinton, "Deep learning," Nature, vol. 521, no. 7553, pp. 436-444, 2015/05/01 2015, doi: 10.1038/nature14539.
[59] A. Krizhevsky, I. Sutskever, and G. Hinton, "ImageNet Classification with Deep Convolutional Neural Networks," Neural Information Processing Systems, vol. 25, 01/01 2012, doi: 10.1145/3065386.
[60] C. Szegedy et al., "Going deeper with convolutions," in 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 7-12 June 2015 2015, pp. 1-9, doi: 10.1109/CVPR.2015.7298594.
[61] C. Szegedy, V. Vanhoucke, S. Ioffe, J. Shlens, and Z. Wojna, "Rethinking the Inception Architecture for Computer Vision," in 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 27-30 June 2016 2016, pp. 2818-2826, doi: 10.1109/CVPR.2016.308.
[62] K. He, X. Zhang, S. Ren, and J. Sun, "Deep Residual Learning for Image Recognition," in 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 27-30 June 2016 2016, pp. 770-778, doi: 10.1109/CVPR.2016.90.
[63] G. Huang, Z. Liu, L. V. D. Maaten, and K. Q. Weinberger, "Densely Connected Convolutional Networks," in 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 21-26 July 2017 2017, pp. 2261-2269, doi: 10.1109/CVPR.2017.243.
[64] F. N. Iandola, S. Han, M. W. Moskewicz, K. Ashraf, W. J. Dally, and K. Keutzer, "SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5 MB model size," arXiv preprint arXiv:1602.07360, 2016.
[65] V. Snow et al., "Resilience achieved via multiple compensating subsystems: The immediate impacts of COVID-19 control measures on the agri-food systems of Australia and New Zealand," Agricultural Systems, vol. 187, p. 103025, 2021/02/01/ 2021, doi: https://doi.org/10.1016/j.agsy.2020.103025.
[66] N. Morris. (2021). "Fruit and vegetable growers' mental health declines with no end in sight to worker shortage, new data reveals." ABC News. Available: https://www.abc.net.au/news/rural/2021-02-04/worker-shortage-causes-grower-mental-health-decline/13104084
[67] Y. Zhuang. (2021). "Without Backpackers to Pick Them, Crops Rot by the Ton in Australia." The New York Times. Available: https://www.nytimes.com/2021/03/02/world/australia/agriculture-backpackers.html
[68] H. K. Wu, J. S. Wang, and Y. H. Chen, "Development of Fruit Grading System Based on Image Recognition," in 2020 IEEE 2nd International Conference on Architecture, Construction, Environment and Hydraulics (ICACEH), 25-27 Dec. 2020 2020, pp. 26-27, doi: 10.1109/ICACEH51803.2020.9366224.
[69] M. E. Karar, F. Alsunaydi, S. Albusaymi, and S. Alotaibi, "A new mobile application of agricultural pests recognition using deep learning in cloud computing system," Alexandria Engineering Journal, vol. 60, no. 5, pp. 4423-4432, 2021/10/01/ 2021, doi: https://doi.org/10.1016/j.aej.2021.03.009.
[70] H.-W. Liu, C.-H. Chen, Y.-C. Tsai, K.-W. Hsieh, and H.-T. Lin, "Identifying Images of Dead Chickens with a Chicken Removal System Integrated with a Deep Learning Algorithm," Sensors, vol. 21, no. 11, p. 3579, 2021. [Online]. Available: https://www.mdpi.com/1424-8220/21/11/3579.
[71] Y. R. Chen et al., "An AI-based System for Monitoring Behavior and Growth of Pigs," in 2020 International Computer Symposium (ICS), 17-19 Dec. 2020 2020, pp. 91-95, doi: 10.1109/ICS51289.2020.00027.
[72] I. Huang et al., "The Prototype of a Smart Underwater Surveillance System for Shrimp Farming," in 2018 IEEE International Conference on Advanced Manufacturing (ICAM), 16-18 Nov. 2018 2018, pp. 177-180, doi: 10.1109/AMCON.2018.8614976.
描述 (Description) Master's thesis
National Chengchi University
Graduate Institute of Library, Information and Archival Studies
108155021
資料來源 (Source) http://thesis.lib.nccu.edu.tw/record/#G0108155021
資料類型 (Type) thesis
識別碼 (Identifier) G0108155021
URI http://nccur.lib.nccu.edu.tw/handle/140.119/136929
目次 (Table of contents) Acknowledgements
Abstract (Chinese)
Abstract (English)
Contents
List of figures
List of tables
Chapter 1 Introduction
1.1 The state of world agriculture
1.2 Smart agriculture
1.3 Fruit and vegetable image recognition
Chapter 2 Literature review
2.1 Fruit and vegetable image recognition abroad
2.2 Fruit and vegetable image recognition in Taiwan
2.3 Summary of the literature review
Chapter 3 Materials and methods
3.1 Three knowledge models
3.2 Convolutional neural networks
Chapter 4 Experimental results
4.1 Training and validation
4.2 Test image datasets
Chapter 5 Conclusions and future directions
5.1 Conclusions
5.2 Future directions
References
格式 (Format) application/pdf, 3,821,367 bytes
dc.relation.reference (參考文獻) [1] S. Sharma, R. Shandilya, U. S. Tim, and J. Wong, "eFeed-Hungers.com: Mitigating global hunger crisis using next generation technologies," Telematics and Informatics, vol. 35, no. 2, pp. 446-456, 2018/05/01/ 2018, doi: https://doi.org/10.1016/j.tele.2018.01.003.
[2] "Selected Results of the 2019 UN World Population Projections," Population and Development Review, vol. 45, no. 3, pp. 689-694, 2019, doi: https://doi.org/10.1111/padr.12288.
[3] J. A. Foley et al., "Solutions for a cultivated planet," Nature, vol. 478, no. 7369, pp. 337-342, 2011/10/01 2011, doi: 10.1038/nature10452.
[4] J. Busch and J. Engelmann, "Cost-effectiveness of reducing emissions from tropical deforestation, 2016–2050," Environmental Research Letters, vol. 13, no. 1, p. 015001, 2017/12/01 2017, doi: 10.1088/1748-9326/aa907c.
[5] P. Friedlingstein et al., "Update on CO2 emissions," Nature Geoscience, vol. 3, no. 12, pp. 811-812, 2010/12/01 2010, doi: 10.1038/ngeo1022.
[6] R. DeFries and C. Rosenzweig, "Toward a whole-landscape approach for sustainable land use in the tropics," Proceedings of the National Academy of Sciences, vol. 107, no. 46, p. 19627, 2010, doi: 10.1073/pnas.1011163107.
[7] J. Poesen, "Soil erosion in the Anthropocene: Research needs," Earth Surface Processes and Landforms, vol. 43, no. 1, pp. 64-84, 2018, doi: 10.1002/esp.4250.
[8] M. Burke and K. Emerick, "Adaptation to Climate Change: Evidence from US Agriculture," (in English), Am. Econ. J.-Econ. Policy, Article vol. 8, no. 3, pp. 106-140, Aug 2016, doi: 10.1257/pol.20130025.
[9] T. Vanwalleghem et al., "Impact of historical land use and soil management change on soil erosion and agricultural sustainability during the Anthropocene," Anthropocene, vol. 17, pp. 13-29, 2017/03/01/ 2017, doi: https://doi.org/10.1016/j.ancene.2017.01.002.
[10] W. Ouyang, Y. Wu, Z. Hao, Q. Zhang, Q. Bu, and X. Gao, "Combined impacts of land use and soil property changes on soil erosion in a mollisol area under long-term agricultural development," Science of The Total Environment, vol. 613-614, pp. 798-809, 2018/02/01/ 2018, doi: https://doi.org/10.1016/j.scitotenv.2017.09.173.
[11] J. P. Vasconez, G. A. Kantor, and F. A. A. Cheein, "Human-robot interaction in agriculture: A survey and current challenges," (in English), Biosyst. Eng., Review vol. 179, pp. 35-48, Mar 2019, doi: 10.1016/j.biosystemseng.2018.12.005.
[12] J. B. Samtani et al., "The Status and Future of the Strawberry Industry in the United States," (in English), HortTechnology hortte, vol. 29, no. 1, p. 11, 01 Feb. 2019 2019, doi: 10.21273/horttech04135-18.
[13] J. Guthman, "Paradoxes of the Border: Labor Shortages and Farmworker Minor Agency in Reworking California`s Strawberry Fields," (in English), Econ. Geogr., Article vol. 93, no. 1, pp. 24-43, Jan 2017, doi: 10.1080/00130095.2016.1180241.
[14] B. o. L. Statistics.(2020. Access Date) "Farming, Fishing, and Forestry Occupations." Occupational Employment Statistics. Available: https://www.bls.gov/oes/current/oes450000.htm#
[15] X. Zheng, "Drawing upon the Experience and Lessons of the Netherlands and Japan to Accelerate Agricultural Modernization," in China’s 40 Years of Economic Reform and Development: How the Miracle Was Created. Singapore: Springer Singapore, 2018, pp. 315-319.
[16] J.-H. Chuang, J.-H. Wang, and Y.-C. Liou, "Farmers’ Knowledge, Attitude, and Adoption of Smart Agriculture Technology in Taiwan," Int. J. Environ. Res. Public Health, vol. 17, no. 19, p. 7236, 2020. [Online]. Available: https://www.mdpi.com/1660-4601/17/19/7236.
[17] C. A. Damalas and I. G. Eleftherohorinos, "Pesticide Exposure, Safety Issues, and Risk Assessment Indicators," (in English), Int. J. Environ. Res. Public Health, Review vol. 8, no. 5, pp. 1402-1419, May 2011, doi: 10.3390/ijerph8051402.
[18] L. P. Rocha, M. R. Cezar-Vaz, M. C. V. de Almeida, D. R. Piexak, and C. A. Bonow, "Association between pain and agricultural workload," (in English), Acta Paul. Enferm., Article vol. 27, no. 4, pp. 333-339, 2014, doi: 10.1590/1982-0194201400056.
[19] M. S. Farooq, S. Riaz, A. Abid, K. Abid, and M. A. Naeem, "A Survey on the Role of IoT in Agriculture for the Implementation of Smart Farming," IEEE Access, vol. 7, pp. 156237-156271, 2019, doi: 10.1109/ACCESS.2019.2949703.
[20] Ö. Köksal and B. Tekinerdogan, "Architecture design approach for IoT-based farm management information systems," Precision Agriculture, vol. 20, no. 5, pp. 926-958, 2019/10/01 2019, doi: 10.1007/s11119-018-09624-8.
[21] A. Triantafyllou, D. C. Tsouros, P. Sarigiannidis, and S. Bibi, "An Architecture model for Smart Farming," in 2019 15th International Conference on Distributed Computing in Sensor Systems (DCOSS), 29-31 May 2019 2019, pp. 385-392, doi: 10.1109/DCOSS.2019.00081.
[22] F. Pallottino et al., "Machine Vision Retrofit System for Mechanical Weed Control in Precision Agriculture Applications," (in English), Sustainability, Article vol. 10, no. 7, p. 9, Jul 2018, Art no. 2209, doi: 10.3390/su10072209.
[23] S. C. Li, L. D. Xu, and S. S. Zhao, "The internet of things: a survey," (in English), Inf. Syst. Front., Article vol. 17, no. 2, pp. 243-259, Apr 2015, doi: 10.1007/s10796-014-9492-7.
[24] K. Chang, P. Liu, Z. Kuo, and S. Liao, "Design of persimmon growing stage monitoring system using image recognition technique," in 2016 IEEE International Conference on Consumer Electronics-Taiwan (ICCE-TW), 27-29 May 2016 2016, pp. 1-2, doi: 10.1109/ICCE-TW.2016.7520978.
[25] T. Lewis, "Evolution of farm management information systems," (in English), Comput. Electron. Agric., Article vol. 19, no. 3, pp. 233-248, Mar 1998, doi: 10.1016/s0168-1699(97)00040-9.
[26] G. N. Lu, M. Batty, J. Strobl, H. Lin, A. X. Zhu, and M. Chen, "Reflections and speculations on the progress in Geographic Information Systems (GIS): a geographic perspective," (in English), INTERNATIONAL JOURNAL OF GEOGRAPHICAL INFORMATION SCIENCE, vol. 33, no. 2, pp. 346-367, FEB 1 2019, doi: 10.1080/13658816.2018.1533136.
[27] R. H. Sprague, "A Framework for the Development of Decision Support Systems," MIS Quarterly, vol. 4, no. 4, pp. 1-26, 1980, doi: 10.2307/248957.
[28] F. Y. Narvaez, G. Reina, M. Torres-Torriti, G. Kantor, and F. A. Cheein, "A Survey of Ranging and Imaging Techniques for Precision Agriculture Phenotyping," IEEE/ASME Transactions on Mechatronics, vol. 22, no. 6, pp. 2428-2439, 2017, doi: 10.1109/TMECH.2017.2760866.
[29] B. A. Aubert, A. Schroeder, and J. Grimaudo, "IT as enabler of sustainable farming: An empirical analysis of farmers' adoption decision of precision agriculture technology," Decision Support Systems, vol. 54, no. 1, pp. 510-520, Dec. 2012, doi: 10.1016/j.dss.2012.07.002.
[30] J. Clapp and S. L. Ruder, "Precision Technologies for Agriculture: Digital Farming, Gene-Edited Crops, and the Politics of Sustainability," Glob. Environ. Polit., vol. 20, no. 3, pp. 49-69, Aug. 2020, doi: 10.1162/glep_a_00566.
[31] A. King, "Technology: The Future of Agriculture," Nature, vol. 544, no. 7651, pp. S21-S23, Apr. 2017, doi: 10.1038/544S21a.
[32] V. Subramanian, T. F. Burks, and A. A. Arroyo, "Development of machine vision and laser radar based autonomous vehicle guidance systems for citrus grove navigation," Comput. Electron. Agric., vol. 53, no. 2, pp. 130-143, Sep. 2006, doi: 10.1016/j.compag.2006.06.001.
[33] H. Mousazadeh, "A technical review on navigation systems of agricultural autonomous off-road vehicles," Journal of Terramechanics, vol. 50, no. 3, pp. 211-232, Jun. 2013, doi: 10.1016/j.jterra.2013.03.004.
[34] V. Marinoudi, C. G. Sorensen, S. Pearson, and D. Bochtis, "Robotics and labour in agriculture. A context consideration," Biosyst. Eng., vol. 184, pp. 111-121, Aug. 2019, doi: 10.1016/j.biosystemseng.2019.06.013.
[35] P. S. S, K. Malarvizhi, S. Karthik, and M. G. S.G, "Machine Learning and Internet of Things based Smart Agriculture," in 2020 6th International Conference on Advanced Computing and Communication Systems (ICACCS), Mar. 2020, pp. 1101-1106, doi: 10.1109/ICACCS48705.2020.9074472.
[36] D. Wang, C. Li, H. Song, H. Xiong, C. Liu, and D. He, "Deep Learning Approach for Apple Edge Detection to Remotely Monitor Apple Growth in Orchards," IEEE Access, vol. 8, pp. 26911-26925, 2020, doi: 10.1109/ACCESS.2020.2971524.
[37] J. D. Pujari, R. Yakkundimath, and A. S. Byadgi, "Identification and classification of fungal disease affected on agriculture/horticulture crops using image processing techniques," in 2014 IEEE International Conference on Computational Intelligence and Computing Research, Dec. 2014, pp. 1-4, doi: 10.1109/ICCIC.2014.7238283.
[38] M. S. Hossain, M. Al-Hammadi, and G. Muhammad, "Automatic Fruit Classification Using Deep Learning for Industrial Applications," IEEE Transactions on Industrial Informatics, vol. 15, no. 2, pp. 1027-1034, 2019, doi: 10.1109/TII.2018.2875149.
[39] M. Recce, J. Taylor, A. Plebe, and G. Tropiano, "Vision and neural control for an orange harvesting robot," in Proceedings of International Workshop on Neural Networks for Identification, Control, Robotics and Signal/Image Processing, Aug. 1996, pp. 467-475, doi: 10.1109/NICRSP.1996.542791.
[40] R. Ceres, J. L. Pons, A. R. Jiménez, J. M. Martín, and L. Calderón, "Design and implementation of an aided fruit-harvesting robot (Agribot)," Industrial Robot: An International Journal, vol. 25, no. 5, pp. 337-346, 1998, doi: 10.1108/01439919810232440.
[41] K. Tanigaki, T. Fujiura, A. Akase, and J. Imagawa, "Cherry-harvesting robot," Comput. Electron. Agric., vol. 63, no. 1, pp. 65-72, Aug. 2008, doi: 10.1016/j.compag.2008.01.018.
[42] N. Irie, N. Taguchi, T. Horie, and T. Ishimatsu, "Asparagus harvesting robot coordinated with 3-D vision sensor," in 2009 IEEE International Conference on Industrial Technology, Feb. 2009, pp. 1-6, doi: 10.1109/ICIT.2009.4939556.
[43] S. Hayashi et al., "Field Operation of a Movable Strawberry-harvesting Robot using a Travel Platform," Japan Agricultural Research Quarterly: JARQ, vol. 48, no. 3, pp. 307-316, 2014, doi: 10.6090/jarq.48.307.
[44] B. Arad et al., "Development of a sweet pepper harvesting robot," Journal of Field Robotics, vol. 37, no. 6, pp. 1027-1039, 2020, doi: 10.1002/rob.21937.
[45] A. Bhargava and A. Bansal, "Fruits and vegetables quality evaluation using computer vision: A review," Journal of King Saud University - Computer and Information Sciences, 2018, doi: 10.1016/j.jksuci.2018.06.002.
[46] F. Raponi, R. Moscetti, D. Monarca, A. Colantoni, and R. Massantini, "Monitoring and Optimization of the Process of Drying Fruits and Vegetables Using Computer Vision: A Review," Sustainability, vol. 9, no. 11, Art. no. 2009, Nov. 2017, doi: 10.3390/su9112009.
[47] G. Romano, M. Nagle, and J. Müller, "Two-parameter Lorentzian distribution for monitoring physical parameters of golden colored fruits during drying by application of laser light in the Vis/NIR spectrum," Innovative Food Science & Emerging Technologies, vol. 33, pp. 498-505, Feb. 2016, doi: 10.1016/j.ifset.2015.11.007.
[48] J. L. Rojas-Aranda, J. I. Nunez-Varela, J. C. Cuevas-Tello, and G. Rangel-Ramirez, "Fruit Classification for Retail Stores Using Deep Learning," in Pattern Recognition, K. M. Figueroa Mora, J. Anzurez Marín, J. Cerda, J. A. Carrasco-Ochoa, J. F. Martínez-Trinidad, and J. A. Olvera-López, Eds. Cham: Springer International Publishing, 2020, pp. 3-13.
[49] Y. Osako, H. Yamane, S.-Y. Lin, P.-A. Chen, and R. Tao, "Cultivar discrimination of litchi fruit images using deep learning," Scientia Horticulturae, vol. 269, Art. no. 109360, Jul. 2020, doi: 10.1016/j.scienta.2020.109360.
[50] J.-R. Xiao, P.-C. Chung, H.-Y. Wu, Q.-H. Phan, J.-L. A. Yeh, and M. T. Hou, "Detection of Strawberry Diseases Using a Convolutional Neural Network," Plants, vol. 10, no. 1, 2021, doi: 10.3390/plants10010031.
[51] S. Dargan, M. Kumar, M. R. Ayyagari, and G. Kumar, "A Survey of Deep Learning and Its Applications: A New Paradigm to Machine Learning," Archives of Computational Methods in Engineering, vol. 27, no. 4, pp. 1071-1092, Sep. 2020, doi: 10.1007/s11831-019-09344-w.
[52] S. K. Behera, A. K. Rath, and P. K. Sethy, "Maturity status classification of papaya fruits based on machine learning and transfer learning approach," Information Processing in Agriculture, 2020, doi: 10.1016/j.inpa.2020.05.003.
[53] M. Jogin, Mohana, M. S. Madhulika, G. D. Divya, R. K. Meghana, and S. Apoorva, "Feature Extraction using Convolution Neural Networks (CNN) and Deep Learning," in 2018 3rd IEEE International Conference on Recent Trends in Electronics, Information & Communication Technology (RTEICT), May 2018, pp. 2319-2323, doi: 10.1109/RTEICT42901.2018.9012507.
[54] R. Kline, "Cybernetics, automata studies, and the Dartmouth conference on artificial intelligence," IEEE Annals of the History of Computing, vol. 33, no. 4, pp. 5-16, 2010.
[55] U. Gasser and V. A. F. Almeida, "A Layered Model for AI Governance," IEEE Internet Computing, vol. 21, no. 6, pp. 58-62, 2017, doi: 10.1109/MIC.2017.4180835.
[56] N. Kriegeskorte and T. Golan, "Neural network models and deep learning," Current Biology, vol. 29, no. 7, pp. R231-R236, Apr. 2019, doi: 10.1016/j.cub.2019.02.034.
[57] S. Sengupta et al., "A review of deep learning with special emphasis on architectures, applications and recent trends," Knowledge-Based Syst., vol. 194, Art. no. 105596, Apr. 2020, doi: 10.1016/j.knosys.2020.105596.
[58] Y. LeCun, Y. Bengio, and G. Hinton, "Deep learning," Nature, vol. 521, no. 7553, pp. 436-444, May 2015, doi: 10.1038/nature14539.
[59] A. Krizhevsky, I. Sutskever, and G. Hinton, "ImageNet Classification with Deep Convolutional Neural Networks," Neural Information Processing Systems, vol. 25, 2012, doi: 10.1145/3065386.
[60] C. Szegedy et al., "Going deeper with convolutions," in 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Jun. 2015, pp. 1-9, doi: 10.1109/CVPR.2015.7298594.
[61] C. Szegedy, V. Vanhoucke, S. Ioffe, J. Shlens, and Z. Wojna, "Rethinking the Inception Architecture for Computer Vision," in 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Jun. 2016, pp. 2818-2826, doi: 10.1109/CVPR.2016.308.
[62] K. He, X. Zhang, S. Ren, and J. Sun, "Deep Residual Learning for Image Recognition," in 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Jun. 2016, pp. 770-778, doi: 10.1109/CVPR.2016.90.
[63] G. Huang, Z. Liu, L. V. D. Maaten, and K. Q. Weinberger, "Densely Connected Convolutional Networks," in 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Jul. 2017, pp. 2261-2269, doi: 10.1109/CVPR.2017.243.
[64] F. N. Iandola, S. Han, M. W. Moskewicz, K. Ashraf, W. J. Dally, and K. Keutzer, "SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5 MB model size," arXiv preprint arXiv:1602.07360, 2016.
[65] V. Snow et al., "Resilience achieved via multiple compensating subsystems: The immediate impacts of COVID-19 control measures on the agri-food systems of Australia and New Zealand," Agricultural Systems, vol. 187, Art. no. 103025, Feb. 2021, doi: 10.1016/j.agsy.2020.103025.
[66] N. Morris, "Fruit and vegetable growers' mental health declines with no end in sight to worker shortage, new data reveals," ABC News, 2021. Available: https://www.abc.net.au/news/rural/2021-02-04/worker-shortage-causes-grower-mental-health-decline/13104084
[67] Y. Zhuang, "Without Backpackers to Pick Them, Crops Rot by the Ton in Australia," The New York Times, 2021. Available: https://www.nytimes.com/2021/03/02/world/australia/agriculture-backpackers.html
[68] H. K. Wu, J. S. Wang, and Y. H. Chen, "Development of Fruit Grading System Based on Image Recognition," in 2020 IEEE 2nd International Conference on Architecture, Construction, Environment and Hydraulics (ICACEH), Dec. 2020, pp. 26-27, doi: 10.1109/ICACEH51803.2020.9366224.
[69] M. E. Karar, F. Alsunaydi, S. Albusaymi, and S. Alotaibi, "A new mobile application of agricultural pests recognition using deep learning in cloud computing system," Alexandria Engineering Journal, vol. 60, no. 5, pp. 4423-4432, Oct. 2021, doi: 10.1016/j.aej.2021.03.009.
[70] H.-W. Liu, C.-H. Chen, Y.-C. Tsai, K.-W. Hsieh, and H.-T. Lin, "Identifying Images of Dead Chickens with a Chicken Removal System Integrated with a Deep Learning Algorithm," Sensors, vol. 21, no. 11, p. 3579, 2021. [Online]. Available: https://www.mdpi.com/1424-8220/21/11/3579.
[71] Y. R. Chen et al., "An AI-based System for Monitoring Behavior and Growth of Pigs," in 2020 International Computer Symposium (ICS), Dec. 2020, pp. 91-95, doi: 10.1109/ICS51289.2020.00027.
[72] I. Huang et al., "The Prototype of a Smart Underwater Surveillance System for Shrimp Farming," in 2018 IEEE International Conference on Advanced Manufacturing (ICAM), Nov. 2018, pp. 177-180, doi: 10.1109/AMCON.2018.8614976.
DOI: 10.6814/NCCU202101406