Please use this identifier to cite or link to this item: https://ah.lib.nccu.edu.tw/handle/140.119/35271
DC Field | Value | Language
dc.contributor.advisor蔡瑞煌zh_TW
dc.contributor.advisorTsaih, Rayen_US
dc.contributor.author林志忠zh_TW
dc.contributor.authorLin, Chih-chungen_US
dc.creator林志忠zh_TW
dc.creatorLin, Chih-chungen_US
dc.date2004en_US
dc.date.accessioned2009-09-18T06:36:15Z-
dc.date.available2009-09-18T06:36:15Z-
dc.date.issued2009-09-18T06:36:15Z-
dc.identifierG0923560141en_US
dc.identifier.urihttps://nccur.lib.nccu.edu.tw/handle/140.119/35271-
dc.descriptionMaster's degreezh_TW
dc.descriptionNational Chengchi Universityzh_TW
dc.descriptionGraduate Institute of Information Managementzh_TW
dc.description92356014zh_TW
dc.description93zh_TW
dc.description.abstractA rule-extraction method for neural network systems is proposed, through which the relevant rules are obtained from the neural network. The method described here is based on the concept of the inverse function.zh_TW
dc.description.abstractA rule-extraction method for layered feed-forward neural networks is proposed here for identifying the rules embedded in the network. The method, applied to a trained layered feed-forward neural network, is based on inverting the function computed by each layer of the network: it back-propagates regions from the output layer back to the input layer. We hope the method can further be used to address the predicament of the ANN being a black box.en_US
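The abstract describes inverting the function computed by each layer, from the output back to the input. A minimal sketch of that idea for a 1-1-1 network (one input, one hidden node, one output) is given below; the weights and the simplification to exact point-wise inversion are illustrative assumptions, not the thesis's actual region-based algorithm:

```python
import math

# Hypothetical 1-1-1 feed-forward network: y = w2 * tanh(w1 * x + b1) + b2.
# All weight values below are illustrative assumptions.
w1, b1 = 2.0, -1.0   # affine net transformation (hidden layer)
w2, b2 = 0.5, 0.1    # linear output transformation

def forward(x):
    """Compute the network output for input x."""
    return w2 * math.tanh(w1 * x + b1) + b2

def invert(y):
    """Back-propagate an output value through each layer in reverse."""
    h = (y - b2) / w2        # invert the linear output transformation
    net = math.atanh(h)      # invert the tanh transfer function
    x = (net - b1) / w1      # invert the affine net transformation
    return x

# Round trip: inverting the output recovers the input.
x0 = 0.3
assert abs(invert(forward(x0)) - x0) < 1e-9
```

With several hidden nodes or inputs the inverse image of an output value is a region rather than a point, which is why the thesis back-propagates regions (polyhedra) instead of single values.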
dc.description.tableofcontents
1. Introduction … 1
2. Literature Review … 3
2.1 A Mathematical Study of the Layered Feed-forward Neural Networks … 3
2.2 The Rule Extraction from Multi-layer Feed-forward Neural Networks … 6
2.3 Discussion … 7
3. The Definition and Representation of Polyhedra … 9
4. The Definition and Representation of Feed-forward Neural Networks … 10
5. The rule-extraction of a 3-layer feed-forward approximation network … 12
5.1 The back-propagation with respect to the linear output transformation sub-process … 12
5.2 The back-propagation with respect to the approximation function sub-process … 12
5.3 The back-propagation with respect to the affine net transformation sub-process … 13
5.4 An illustration of the rule-extraction … 13
6. The rule-extraction of a 3-layer feed-forward neural network … 21
6.1 The back-propagation with respect to the linear output transformation sub-process … 21
6.2 The back-propagation with respect to the transfer function sub-process … 21
6.3 The back-propagation with respect to the affine net transformation sub-process … 22
6.4 An illustration of the rule-extraction … 22
7. The Illustration … 26
7.1 Definition … 26
7.2 The Network I … 26
7.2.1 The rule-extraction of the Network I … 26
7.2.2 The rule-extraction of the approximation Network I … 28
7.3 The Network II … 29
7.3.1 The rule-extraction of the Network II … 29
7.3.2 The rule-extraction of the approximation Network II … 32
7.4 The Network III … 34
7.4.1 The rule-extraction of the Network III … 34
7.4.2 The rule-extraction of the approximation Network III … 36
8. The Discussion of Error Ratio and The Future Work … 38
8.1 Definition … 38
8.2 The Definition of Evaluation Mechanism … 38
8.3 The Discussion of Network I … 38
8.3.1 The Discussion of ER1 in Network I … 39
8.3.2 The Discussion of ER2 in Network I … 40
8.4 The Discussion of Network II … 41
8.4.1 The Discussion of ER1 in Network II … 42
8.4.2 The Discussion of ER2 in Network II … 45
8.5 The Discussion of Network III … 49
8.5.1 The Discussion of ER1 in Network III … 49
8.5.2 The Discussion of ER2 in Network III … 51
8.6 The Future Work … 53

Figure 1: The feed-forward neural network with one hidden layer and one output node … 3
Figure 2: The feed-forward neural network with one hidden layer and one output node … 6
Figure 3: The framework of feed-forward neural network … 10
Figure 6: The observation of f^-1(-0.5) in Network I … 27
Figure 7: The observation of Xa(-0.5) in the approximation Network I … 28
Figure 9: The observation of Network II … 31
Figure 10: The observation of approximation Network II … 33
Figure 11: The observation of f^-1(y) in Network III … 35
Figure 12: The observation of Xa( ) in Network III … 37
Figure 13: The observation of Xa(-0.5) and f^-1(-0.5) in Network I … 39
Figure 14: The difference of y and ya in Network I … 39
Figure 15: The observation of ER1 in Network I … 40
Figure 16: The difference of and in Network I … 40
Figure 17: The observation of ER2 in Network I … 41
Figure 18: The observation of Xa(-1.29) and f^-1(-1.29) in Network II … 41
Figure 19: The difference of y and ya in Network II … 42
Figure 20: The observation of ER1 in Network II … 42
Figure 21: The difference of y and ya in Network II … 43
Figure 22: The observation of ER1 in Network II … 43
Figure 23: The difference of y and ya in Network II … 44
Figure 24: The observation of ER1 in Network II … 44
Figure 25: The difference of and in Network II … 45
Figure 26: The observation of ER2 in Network II … 45
Figure 27: The difference of and in Network II … 46
Figure 28: The observation of ER2 in Network II … 46
Figure 29: The difference of and in Network II … 47
Figure 30: The observation of ER2 in Network II … 47
Figure 31: The difference of and in Network II … 48
Figure 32: The observation of ER2 in Network II … 48
Figure 33: The observation of Xa(y) and f^-1(y) in Network III … 49
Figure 34: The difference of y and ya in Network III … 50
Figure 35: The observation of ER1 in Network III … 50
Figure 36: The difference of and in Network III … 51
Figure 37: The observation of ER2 in Network III … 51
Figure 38: The difference of and in Network III … 52
Figure 39: The observation of ER2 in Network III … 52zh_TW
dc.format.extent43419 bytes-
dc.format.extent200561 bytes-
dc.format.extent25327 bytes-
dc.format.extent38352 bytes-
dc.format.extent26846 bytes-
dc.format.extent140054 bytes-
dc.format.extent21584 bytes-
dc.format.extent32878 bytes-
dc.format.extent99774 bytes-
dc.format.extent119180 bytes-
dc.format.extent216500 bytes-
dc.format.extent193505 bytes-
dc.format.extent31937 bytes-
dc.format.mimetypeapplication/pdf-
dc.format.mimetypeapplication/pdf-
dc.format.mimetypeapplication/pdf-
dc.format.mimetypeapplication/pdf-
dc.format.mimetypeapplication/pdf-
dc.format.mimetypeapplication/pdf-
dc.format.mimetypeapplication/pdf-
dc.format.mimetypeapplication/pdf-
dc.format.mimetypeapplication/pdf-
dc.format.mimetypeapplication/pdf-
dc.format.mimetypeapplication/pdf-
dc.format.mimetypeapplication/pdf-
dc.format.mimetypeapplication/pdf-
dc.language.isoen_US-
dc.source.urihttp://thesis.lib.nccu.edu.tw/record/#G0923560141en_US
dc.subject類神經網路zh_TW
dc.subject法則萃取zh_TW
dc.subject反函數zh_TW
dc.subjectneural networksen_US
dc.subjectrule-extractionen_US
dc.subjectinversion functionen_US
dc.titleA Mathematical Study of the Rule Extraction of a 3-layered Feed-forward Neural Networkszh_TW
dc.typethesisen
dc.relation.reference[1] Andrews, R., Diederich, J., and Tickle, A. (1995). “A survey and critique of techniques for extracting rules from trained artificial neural networks.” Knowledge-Based Systems, Vol. 8, Issue 6, pp. 373-389.zh_TW
dc.relation.reference[2] Zhou, R. R., Chen, S. F., and Chen, Z. Q. (2000). “A statistics based approach for extracting priority rules from trained neural networks.” In: Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks, Como, Italy, Vol. 3, pp. 401-406.zh_TW
dc.relation.reference[3] Thrun, S. B., and Linden, A. (1990). “Inversion in time.” In: Proceedings of the EURASIP Workshop on Neural Networks, Sesimbra, Portugal.zh_TW
dc.relation.reference[4] Ke, W. C. (2003). “The Rule Extraction from Multi-layer Feed-forward Neural Networks.” Taiwan: National Chengchi University.zh_TW
dc.relation.reference[5] Tsaih, R., and Lin, C. C. (2004). “The Layered Feed-Forward Neural Networks and Its Rule Extraction.” In: Proceedings of ISNN 2004, International Symposium on Neural Networks, Dalian, China, pp. 377-382.zh_TW
dc.relation.reference[6] Rumelhart, D. E., Hinton, G. E., and Williams, R. J. (1986). “Learning internal representations by error propagation.” In: Parallel Distributed Processing, Vol. 1, Cambridge, MA: MIT Press, pp. 318-362.zh_TW
dc.relation.reference[7] Tsaih, R. (1998). “An Explanation of Reasoning Neural Networks.” Mathematical and Computer Modelling, Vol. 28, pp. 37-44.zh_TW
dc.relation.reference[8] Maire, F. (1999). “Rule-extraction by backpropagation of polyhedra.” Neural Networks, Vol. 12, pp. 717-725.zh_TW
item.languageiso639-1en_US-
item.cerifentitytypePublications-
item.openairecristypehttp://purl.org/coar/resource_type/c_46ec-
item.openairetypethesis-
item.fulltextWith Fulltext-
item.grantfulltextopen-
Appears in Collections: 學位論文 (Theses)
Files in This Item:
File | Size | Format
56014101.pdf | 42.4 kB | Adobe PDF
56014102.pdf | 195.86 kB | Adobe PDF
56014103.pdf | 24.73 kB | Adobe PDF
56014104.pdf | 37.45 kB | Adobe PDF
56014105.pdf | 26.22 kB | Adobe PDF
56014106.pdf | 136.77 kB | Adobe PDF
56014107.pdf | 21.08 kB | Adobe PDF
56014108.pdf | 32.11 kB | Adobe PDF
56014109.pdf | 97.44 kB | Adobe PDF
56014110.pdf | 116.39 kB | Adobe PDF
56014111.pdf | 211.43 kB | Adobe PDF
56014112.pdf | 188.97 kB | Adobe PDF
56014113.pdf | 31.19 kB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.