Please use this identifier to cite or link to this item:
https://ah.lib.nccu.edu.tw/handle/140.119/35271
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 蔡瑞煌 | zh_TW |
dc.contributor.advisor | Tsaih, Ray | en_US |
dc.contributor.author | 林志忠 | zh_TW |
dc.contributor.author | Lin, Chih-chung | en_US |
dc.creator | 林志忠 | zh_TW |
dc.creator | Lin, Chih-chung | en_US |
dc.date | 2004 | en_US |
dc.date.accessioned | 2009-09-18T06:36:15Z | - |
dc.date.available | 2009-09-18T06:36:15Z | - |
dc.date.issued | 2009-09-18T06:36:15Z | - |
dc.identifier | G0923560141 | en_US |
dc.identifier.uri | https://nccur.lib.nccu.edu.tw/handle/140.119/35271 | - |
dc.description | Master's | zh_TW |
dc.description | National Chengchi University | zh_TW |
dc.description | Graduate Institute of Information Management | zh_TW |
dc.description | 92356014 | zh_TW |
dc.description | 93 | zh_TW |
dc.description.abstract | A rule-extraction method for neural network systems is proposed to obtain the relevant rules from the network. The method described here is based on the concept of the inverse function. | zh_TW |
dc.description.abstract | A rule-extraction method for layered feed-forward neural networks is proposed here to identify the rules embedded in the network. The method we propose for a trained layered feed-forward neural network is based on inverting the function computed by each layer of the network. The new rule-extraction method back-propagates regions from the output layer to the input layer, and we hope it can further help address the predicament of the ANN being a black box. | en_US |
dc.description.tableofcontents | 1. Introduction … 1
2. Literature Review … 3
  2.1 A Mathematical Study of the Layered Feed-forward Neural Networks … 3
  2.2 The Rule Extraction from Multi-layer Feed-forward Neural Networks … 6
  2.3 Discussion … 7
3. The Definition and Representation of Polyhedra … 9
4. The Definition and Representation of Feed-forward Neural Networks … 10
5. The rule-extraction of a 3-layer feed-forward approximation network … 12
  5.1 The back-propagation with respect to the linear output transformation sub-process … 12
  5.2 The back-propagation with respect to the approximation function sub-process … 12
  5.3 The back-propagation with respect to the affine net transformation sub-process … 13
  5.4 An illustration of the rule-extraction … 13
6. The rule-extraction of a 3-layer feed-forward neural network … 21
  6.1 The back-propagation with respect to the linear output transformation sub-process … 21
  6.2 The back-propagation with respect to the transfer function sub-process … 21
  6.3 The back-propagation with respect to the affine net transformation sub-process … 22
  6.4 An illustration of the rule-extraction … 22
7. The Illustration … 26
  7.1 Definition … 26
  7.2 The Network I … 26
    7.2.1 The rule-extraction of the Network I … 26
    7.2.2 The rule-extraction of the approximation Network I … 28
  7.3 The Network II … 29
    7.3.1 The rule-extraction of the Network II … 29
    7.3.2 The rule-extraction of the approximation Network II … 32
  7.4 The Network III … 34
    7.4.1 The rule-extraction of the Network III … 34
    7.4.2 The rule-extraction of the approximation Network III … 36
8. The Discussion of Error Ratio and The Future Work … 38
  8.1 Definition … 38
  8.2 The Definition of Evaluation Mechanism … 38
  8.3 The Discussion of Network I … 38
    8.3.1 The Discussion of ER1 in Network I … 39
    8.3.2 The Discussion of ER2 in Network I … 40
  8.4 The Discussion of Network II … 41
    8.4.1 The Discussion of ER1 in Network II … 42
    8.4.2 The Discussion of ER2 in Network II … 45
  8.5 The Discussion of Network III … 49
    8.5.1 The Discussion of ER1 in Network III … 49
    8.5.2 The Discussion of ER2 in Network III … 51
  8.6 The Future Work … 53

Figure 1: The feed-forward neural network with one hidden layer and one output node … 3
Figure 2: The feed-forward neural network with one hidden layer and one output node … 6
Figure 3: The framework of feed-forward neural network … 10
Figure 6: The observation of f^-1(-0.5) in Network I … 27
Figure 7: The observation of Xa(-0.5) in the approximation Network I … 28
Figure 9: The observation of Network II … 31
Figure 10: The observation of approximation Network II … 33
Figure 11: The observation of f^-1(y) in Network III … 35
Figure 12: The observation of Xa(y) in Network III … 37
Figure 13: The observation of Xa(-0.5) and f^-1(-0.5) in Network I … 39
Figure 14: The difference of y and ya in Network I … 39
Figure 15: The observation of ER1 in Network I … 40
Figure 16: The difference of … and … in Network I … 40
Figure 17: The observation of ER2 in Network I … 41
Figure 18: The observation of Xa(-1.29) and f^-1(-1.29) in Network II … 41
Figure 19: The difference of y and ya in Network II … 42
Figure 20: The observation of ER1 in Network II … 42
Figure 21: The difference of y and ya in Network II … 43
Figure 22: The observation of ER1 in Network II … 43
Figure 23: The difference of y and ya in Network II … 44
Figure 24: The observation of ER1 in Network II … 44
Figure 25: The difference of … and … in Network II … 45
Figure 26: The observation of ER2 in Network II … 45
Figure 27: The difference of … and … in Network II … 46
Figure 28: The observation of ER2 in Network II … 46
Figure 29: The difference of … and … in Network II … 47
Figure 30: The observation of ER2 in Network II … 47
Figure 31: The difference of … and … in Network II … 48
Figure 32: The observation of ER2 in Network II … 48
Figure 33: The observation of Xa(y) and f^-1(y) in Network III … 49
Figure 34: The difference of y and ya in Network III … 50
Figure 35: The observation of ER1 in Network III … 50
Figure 36: The difference of … and … in Network III … 51
Figure 37: The observation of ER2 in Network III … 51
Figure 38: The difference of … and … in Network III … 52
Figure 39: The observation of ER2 in Network III … 52 | zh_TW |
dc.format.extent | 43419 bytes | - |
dc.format.extent | 200561 bytes | - |
dc.format.extent | 25327 bytes | - |
dc.format.extent | 38352 bytes | - |
dc.format.extent | 26846 bytes | - |
dc.format.extent | 140054 bytes | - |
dc.format.extent | 21584 bytes | - |
dc.format.extent | 32878 bytes | - |
dc.format.extent | 99774 bytes | - |
dc.format.extent | 119180 bytes | - |
dc.format.extent | 216500 bytes | - |
dc.format.extent | 193505 bytes | - |
dc.format.extent | 31937 bytes | - |
dc.format.mimetype | application/pdf | - |
dc.format.mimetype | application/pdf | - |
dc.format.mimetype | application/pdf | - |
dc.format.mimetype | application/pdf | - |
dc.format.mimetype | application/pdf | - |
dc.format.mimetype | application/pdf | - |
dc.format.mimetype | application/pdf | - |
dc.format.mimetype | application/pdf | - |
dc.format.mimetype | application/pdf | - |
dc.format.mimetype | application/pdf | - |
dc.format.mimetype | application/pdf | - |
dc.format.mimetype | application/pdf | - |
dc.format.mimetype | application/pdf | - |
dc.language.iso | en_US | - |
dc.source.uri | http://thesis.lib.nccu.edu.tw/record/#G0923560141 | en_US |
dc.subject | 類神經網路 | zh_TW |
dc.subject | 法則萃取 | zh_TW |
dc.subject | 反函數 | zh_TW |
dc.subject | neural networks | en_US |
dc.subject | rule-extraction | en_US |
dc.subject | inversion function | en_US |
dc.title | A Mathematical Study of the Rule Extraction of a 3-layered Feed-forward Neural Network | zh_TW |
dc.type | thesis | en |
dc.relation.reference | [1] Andrews, R., Diederich, J., and Tickle, A. (1995). “A survey and critique of techniques for extracting rules from trained artificial neural networks.” Knowledge-Based Systems, Vol. 8, Issue 6, pp. 373-389. | zh_TW |
dc.relation.reference | [2] Zhou, R. R., Chen, S. F., and Chen, Z. Q. (2000). “A statistics based approach for extracting priority rules from trained neural networks.” In: Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks, Como, Italy, Vol. 3, pp. 401-406. | zh_TW |
dc.relation.reference | [3] Thrun, S. B., and Linden, A. (1990). “Inversion in time.” In: Proceedings of the EURASIP Workshop on Neural Networks, Sesimbra, Portugal. | zh_TW |
dc.relation.reference | [4] Ke, W. C. (2003). “The Rule Extraction from Multi-layer Feed-forward Neural Networks.” Taiwan: National Chengchi University. | zh_TW |
dc.relation.reference | [5] Tsaih, R., and Lin, C. C. (2004). “The Layered Feed-Forward Neural Networks and Its Rule Extraction.” In: Proceedings of ISNN 2004, International Symposium on Neural Networks, Dalian, China, pp. 377-382. | zh_TW |
dc.relation.reference | [6] Rumelhart, D. E., Hinton, G. E., and Williams, R. J. (1986). “Learning internal representations by error propagation.” In: Parallel Distributed Processing, Vol. 1, Cambridge, MA: MIT Press, pp. 318-362. | zh_TW |
dc.relation.reference | [7] Tsaih, R. (1998). “An Explanation of Reasoning Neural Networks.” Mathematical and Computer Modelling, Vol. 28, pp. 37-44. | zh_TW |
dc.relation.reference | [8] Maire, F. (1999). “Rule-extraction by backpropagation of polyhedra.” Neural Networks, Vol. 12, pp. 717-725. | zh_TW |
item.languageiso639-1 | en_US | - |
item.cerifentitytype | Publications | - |
item.openairecristype | http://purl.org/coar/resource_type/c_46ec | - |
item.openairetype | thesis | - |
item.fulltext | With Fulltext | - |
item.grantfulltext | open | - |
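The abstract's idea of inverting the function computed by each layer can be illustrated with a minimal sketch. The weights below are hypothetical, and the network is reduced to one input node, one tanh hidden node, and one linear output node; the three inversion steps mirror the sub-processes named in the table of contents (linear output transformation, transfer function, affine net transformation).

```python
import math

# Hypothetical weights for a tiny 3-layer network:
#   y = v * tanh(w * x + b) + c
w, b = 1.5, -0.2   # hidden-layer affine net transformation
v, c = 0.8, 0.1    # linear output transformation

def forward(x):
    return v * math.tanh(w * x + b) + c

def invert(y):
    """Back-propagate an output value to the input that produces it,
    reversing each sub-process in turn."""
    h = (y - c) / v          # undo the linear output transformation
    a = math.atanh(h)        # undo the tanh transfer function (needs |h| < 1)
    return (a - b) / w       # undo the affine net transformation

x = 0.7
print(abs(invert(forward(x)) - x) < 1e-9)  # → True: the round trip recovers x
```

With several hidden nodes the preimage of an output value is no longer a single point but a region, which is why the thesis back-propagates polyhedral regions rather than points.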
Appears in Collections: | 學位論文 |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
56014101.pdf | 42.4 kB | Adobe PDF | View/Open | |
56014102.pdf | 195.86 kB | Adobe PDF | View/Open | |
56014103.pdf | 24.73 kB | Adobe PDF | View/Open | |
56014104.pdf | 37.45 kB | Adobe PDF | View/Open | |
56014105.pdf | 26.22 kB | Adobe PDF | View/Open | |
56014106.pdf | 136.77 kB | Adobe PDF | View/Open | |
56014107.pdf | 21.08 kB | Adobe PDF | View/Open | |
56014108.pdf | 32.11 kB | Adobe PDF | View/Open | |
56014109.pdf | 97.44 kB | Adobe PDF | View/Open | |
56014110.pdf | 116.39 kB | Adobe PDF | View/Open | |
56014111.pdf | 211.43 kB | Adobe PDF | View/Open | |
56014112.pdf | 188.97 kB | Adobe PDF | View/Open | |
56014113.pdf | 31.19 kB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.