
Graduate Student: Hsu Chih Chun (徐志鈞)
Thesis Title: The Reasoning Neural Network and Its Applications (推理類神經網路及其應用)
Advisor: Tsaih (蔡瑞煌)
Degree: Master
Department: Department of Management Information Systems, College of Commerce
Year of Publication: 1994
Graduation Academic Year: 82 (1993-94)
Language: English
Pages: 49
Chinese Keywords: 推理類神經網路 (Reasoning Neural Network), 軟性學習程序 (softening learning procedure), 線性分割條件 (linearly separating condition), 不相關節點 (irrelevant nodes), 推理機能 (reasoning faculty)
Foreign Keywords: The Reasoning Neural Network, the softening learning algorithm, linearly separating condition
  • Most neural networks are designed to solve specific problems rather than to truly model the workings of the human brain. This thesis introduces a neural network that imitates the way humans learn, called the Reasoning Neural Network (RNN). Its two main components are cramming and reasoning; by combining these two components flexibly, the network can carry out a human-like learning procedure. One such learning procedure is introduced, and four experiments are used to evaluate the RNN's performance. The simulation results show that the RNN reaches the learning goal with a reasonable number of hidden nodes, builds an internal representation within the network, and exhibits good generalization ability.


    Most artificial neural networks are designed to solve specific problems, rather than to model the brain. The Reasoning Neural Network (RNN), which imitates the way humans learn, is presented here. The two key components of the RNN are cramming and reasoning; these components can be arranged flexibly to achieve a human-like learning procedure. One edition of the RNN learning procedure used in the experiments is introduced, and four different problems are used to evaluate the RNN's performance. The simulation results show that the RNN accomplishes the goal of learning with a reasonable number of hidden nodes and evolves a good internal representation and good generalization ability.
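    The abstract describes the two components only at a high level. As a rough illustration of the cram-then-reason idea, the toy Python sketch below (an illustrative stand-in, not the thesis's actual algorithm) has a cramming phase that adds a prototype "hidden node" for every training example the net still gets wrong, and a reasoning phase that prunes any node whose removal leaves training accuracy intact, in the spirit of learning with a reasonable number of hidden nodes. The class name, the nearest-prototype decision rule, and the 2-bit parity data are all assumptions made for the example.

    ```python
    # Toy sketch of the "cram then reason" idea from the abstract.
    # NOT the thesis's RNN: prototype hidden nodes and a nearest-centre
    # decision rule are illustrative assumptions.
    import numpy as np

    class CramReasonNet:
        def __init__(self):
            self.centers = []  # one "hidden node" per crammed example
            self.labels = []

        def predict(self, x):
            if not self.centers:
                return 0  # default class before anything is learned
            dists = [np.sum((x - c) ** 2) for c in self.centers]
            return self.labels[int(np.argmin(dists))]

        def cram(self, X, y):
            # Cramming: memorize every still-misclassified example by
            # adding a hidden node for it; repeat until all are correct.
            changed = True
            while changed:
                changed = False
                for xi, yi in zip(X, y):
                    if self.predict(xi) != yi:
                        self.centers.append(xi.copy())
                        self.labels.append(yi)
                        changed = True

        def reason(self, X, y):
            # Reasoning: prune nodes that turn out to be irrelevant,
            # i.e. whose removal keeps the training set fully correct.
            i = 0
            while i < len(self.centers):
                c, l = self.centers.pop(i), self.labels.pop(i)
                if all(self.predict(xi) == yi for xi, yi in zip(X, y)):
                    continue  # node was irrelevant; leave it pruned
                self.centers.insert(i, c)
                self.labels.insert(i, l)
                i += 1

    # 2-bit parity (XOR), the smallest case of the thesis's parity problem
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = [0, 1, 1, 0]
    net = CramReasonNet()
    net.cram(X, y)
    net.reason(X, y)
    print([net.predict(xi) for xi in X])  # → [0, 1, 1, 0]
    ```

    For XOR every crammed node survives pruning; on easier, linearly separable data the reasoning phase may remove many of the crammed nodes, which is the point of pairing the two phases.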

    Contents
    1 Introduction . . . 1

    2 Literature Review
    2.1 The Back Propagation Networks . . . 3
    2.1.1 Network Structure . . . 4
    2.1.2 The Back Propagation Learning Algorithm . . . 4
    2.1.3 The Update Rule in the Back Propagation Learning Algorithm . . . 7
    2.1.4 Local Minimum . . . 8
    2.2 Node Pruning . . . 10
    2.2.1 Sensitivity Calculation . . . 10
    2.2.2 Penalty Term and Weight Decay . . . 13
    2.2.3 Other Pruning Algorithms . . . 15
    2.3 Softening Learning Procedure . . . 18
    2.3.1 Network Structure . . . 18
    2.3.2 Two-Class Categorization Learning . . . 19
    2.3.3 The Learning Algorithm . . . 20
    2.3.4 Cramming Part . . . 21
    2.3.5 Reasoning Part . . . 22

    3 The Reasoning Neural Network
    3.1 The RNN Network Structure . . . 25
    3.2 Prime Parts of the RNN Learning Algorithm . . . 25
    3.3 Explanation . . . 27
    3.4 One Edition of the RNN Learning Procedure . . . 29

    4 Experiments
    4.1 Parity Problem . . . 33
    4.2 Output-Hidden Problem . . . 35
    4.3 Encoder Problem . . . 37
    4.4 Chinese Character Recognition Problem . . . 40

    5 Discussions and Future Works . . . 45

    Bibliography . . . 48

    Bibliography
    Chinese References
    [葉怡成93] 葉怡成, Application and Implementation of Neural Network Models (類神經網路模式應用與實作), 儒林圖書公司, 1994.
    English References

    [Arai89] Arai, M., "Mapping Abilities of Three-Layer Neural Networks," In Proceedings of the International Conference on Neural Networks (Washington, DC: IEEE), pp. I:419-424, 1989.
    [Arai93] Arai, M., "Bounds on the Number of Hidden Nodes in Binary-Valued Three-Layer Neural Networks," In IEEE Transactions on Neural Networks, Vol. 6, pp. 855-860, 1993.
    [Chau89] Chauvin, Y., "A Back-Propagation Algorithm with Optimal Use of Hidden Nodes," In Advances in Neural Information Processing (2), D. S. Touretzky, Ed. (Denver 1988), 1989, pp. 519-526.
    [Cybe89] Cybenko, G., "Approximation by Superpositions of a Sigmoidal Function," Mathematics of Control, Signals, and Systems, 2(4), pp. 303-314, 1989.
    [Cast93] Castellano, G., Fanelli, A. M., Pelillo, M., "An Empirical Comparison of Node Pruning Methods for Layered Feed-forward Neural Networks," In Proceedings of the 1993 International Joint Conference on Neural Networks, pp. 321-326, 1993.
    [Cun90] Le Cun, Y., Denker, J. S., Solla, S. A., "Optimal Brain Damage," In D. Touretzky, Ed., Advances in Neural Information Processing Systems 2, pp. 598-605, Morgan Kaufmann, 1990.
    [Free92] Freeman, J. A., Skapura, D. M., Neural Networks: Algorithms, Applications, and Programming Techniques, Addison-Wesley, Reading, MA, 1992.
    [Hertz91] Hertz, J., Krogh, A., Palmer, R., Introduction to the Theory of Neural Computation, Addison-Wesley, Reading, MA, 1991.
    [Hagi93] Hagiwara, M., "Removal of Hidden Nodes and Weights for Back Propagation Networks," In Proceedings of the 1993 International Joint Conference on Neural Networks, pp. 351-353, 1993.
    [Hajn87] Hajnal, A., Maass, W., Pudlak, P., Szegedy, M., Turan, G., "Threshold Circuits of Bounded Depth," In Proceedings of the 1987 IEEE Symposium on the Foundations of Computer Science, pp. 99-110, 1987.
    [Hint87] Hinton, G. E., Sejnowski, T. J., "Neural Network Architectures for AI," Tutorial, AAAI-87, Seattle, WA, July 1987.
    [Huan91] Huang, S. C., Huang, Y. F., "Bounds on the Number of Hidden Neurons in Multilayer Perceptrons," In IEEE Transactions on Neural Networks, Vol. 2, pp. 47-55, 1991.
    [Hush93] Hush, D. R., Horne, B. G., "Progress in Supervised Neural Networks: What's New Since Lippmann?," In IEEE Signal Processing Magazine, pp. 8-39, January 1993.
    [Ishi90] Ishikawa, M., "A Structural Learning Algorithm with Forgetting of Link Weights," Tech. Rep. TR-90-7, Electrotechnical Lab., Tsukuba-City, Japan, 1990.
    [Jaco88] Jacobs, R. A., "Increased Rates of Convergence Through Learning Rate Adaptation," Neural Networks, Vol. 1, pp. 295-307, 1988.
    [Ji90] Ji, C., Snapp, R. R., Psaltis, D., "Generalizing Smoothness Constraints from Discrete Samples," Neural Computation, Vol. 2, No. 2, pp. 188-197, 1990.
    [Karn90] Karnin, E. D., "A Simple Procedure for Pruning Back-Propagation Trained Neural Networks," IEEE Transactions on Neural Networks, Vol. 2, No. 2, pp. 239-242, 1990.
    [Krus88] Kruschke, J., "Creating Local and Distributed Bottlenecks in Hidden Layers of Back-Propagation Networks," in Touretzky, D., Hinton, G., Sejnowski, T. (Eds.): Proceedings of the Connectionist Models Summer School 1988, Morgan Kaufmann, San Mateo, California, 1988.
    [Makh89] Makhoul, J., El-Jaroudi, A., Schwartz, R., "Formation of Disconnected Decision Regions with a Single Hidden Layer," In Proceedings of the International Joint Conference on Neural Networks, Vol. 1, pp. 455-460, 1989.
    [Merz88] Merzenich, M. M., Recanzone, G., Jenkins, W. M., Allard, T. T., Nudo, R. J., "Cortical Representational Plasticity," In Rakic, P., Singer, W. (Eds.): Neurobiology of Neocortex, John Wiley and Sons Limited, S. Bernhard, Dahlem Konferenzen, 1988.
    [McIn89] McInerny, J. M., Hainer, K. G., Biafore, S., Hecht-Nielsen, R., "Back Propagation Error Surfaces Can Have Local Minima," In International Joint Conference on Neural Networks (Washington 1989), Vol. II, p. 627, New York: IEEE, 1989.
    [Mins69] Minsky, M. L., Papert, S. A., Perceptrons, Cambridge: MIT Press, 1969. Partially reprinted in Anderson and Rosenberg, 1988.
    [Morg91] Morgan, D. P., Scofield, C. L., Neural Networks and Speech Processing, Kluwer Academic Publishers, 1991.
    [Moze89] Mozer, M. C., Smolensky, P., "Skeletonization: A Technique for Trimming the Fat from a Network via Relevance Assessment," In Advances in Neural Information Processing (1), D. S. Touretzky, Ed. (Denver 1988), 1989, pp. 107-115.
    [Mura91] Murase, K., Matsunaga, Y., Nakade, Y., "A Back-Propagation Algorithm Which Automatically Determines the Number of Association Units," Proceedings of the International Joint Conference on Neural Networks (IJCNN-Singapore), Vol. I, pp. 783-788, 1991.
    [Peli93] Pelillo, M., Fanelli, A. M., "A Method of Pruning Layered Feed-forward Neural Networks," in Proceedings of IWANN'93, Sitges, Barcelona, June 1993 (Berlin: Springer-Verlag).
    [Reed93] Reed, R., "Pruning Algorithms - A Survey," in IEEE Transactions on Neural Networks, Vol. 4, No. 5, September 1993, pp. 740-747.
    [Rume86] Rumelhart, D. E., McClelland, J. L., Parallel Distributed Processing, Vols. 1 and 2, MIT Press, Cambridge, MA, 1986.
    [Siet88] Sietsma, J., Dow, R. J. F., "Neural Net Pruning - Why and How," In IEEE International Conference on Neural Networks (San Diego 1988), Vol. 1, pp. 325-333, New York: IEEE, 1988.
    [Tsai93a] Tsaih, R., "The Softening Learning Procedure," In Mathematical and Computer Modelling, Vol. 18, No. 8, pp. 61-64, 1993.
    [Tsai93b] Tsaih, R., "The Softening Learning Procedure for the Layered Feed-forward Networks with Multiple Output Nodes," Proceedings of the International Joint Conference on Neural Networks, Nagoya, Vol. I, pp. 593-596, 1993.
    [Wata93] Watanabe, E., Shimizu, H., "Algorithm for Pruning Hidden Nodes in Multi-Layered Neural Network for Binary Stimulus Classification Problem," In Proceedings of the 1993 International Joint Conference on Neural Networks, pp. 327-330, 1993.
    [Weig91] Weigend, A. S., Rumelhart, D. E., Huberman, B. A., "Generalization by Weight-Elimination with Application to Forecasting," in Advances in Neural Information Processing (3), R. Lippmann, J. Moody, D. Touretzky, Eds., 1991, pp. 875-882.
    [Werb90] Werbos, P. J., "Backpropagation Through Time: What It Does and How to Do It," In Proceedings of the IEEE, Vol. 78, No. 10, October 1990, pp. 1550-1560.
    [Werb74] Werbos, P., "Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences," Ph.D. thesis, Harvard University, Cambridge, MA, August 1974.

    Full text not available for download (restricted to standalone use in Computer Classroom A, 4F, Dah Hsian Library).