
Author: Feng-Sheng Tsai (蔡豐聲)
Thesis title: The Content-Addressable Memory Problem (定址記憶問題)
Advisor: Mau-Hsiang Shih (施茂祥)
Degree: Master
Department: Department of Mathematics
Year of publication: 2003
Academic year of graduation: 91 (2002-2003)
Language: English
Number of pages: 42
Chinese keywords: neural network, recursive neural network, emergent set, star-convex packing problem, Hebb's strengthened learning rule, CAM algorithm, generator, content-addressable memory
English keywords: neural network, recursive network, emergent set, Hamming star-convexity packing, Hebb's strengthened learning rule, CAM algorithm, generator, content-addressable memory
Document type: Academic thesis
  • We propose a solution to a fundamental question about neural networks:
    "Does there exist a recursive neural network that models the memory-storage function of the human brain?"
    The solution we propose centers on the emergent set, the star-convex packing problem, Hebb's strengthened learning rule, and the CAM algorithm, and we prove that the stable equilibrium states of the threshold network are constituted by the 01-generated emergent set. On this basis, we introduce the concept of a generator,
    construct the threshold network from generators, and thereby derive a mechanism for memory storage.
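
    For orientation, the following is a minimal formal sketch, under standard definitions, of the Hamming-metric notions suggested by the phrase "Hamming star-convexity packing"; the precise formulation used in the thesis may differ.

    The Hamming distance on the $n$-cube $\{0,1\}^n$ is
    \[
      d_H(x,y) \;=\; \sum_{i=1}^{n} \lvert x_i - y_i \rvert , \qquad x, y \in \{0,1\}^n .
    \]
    One plausible reading of Hamming star-convexity: a set $S \subseteq \{0,1\}^n$ is star-convex with center $c \in S$ if, for every $x \in S$ and every $y \in \{0,1\}^n$,
    \[
      d_H(c,y) + d_H(y,x) = d_H(c,x) \;\Longrightarrow\; y \in S ,
    \]
    that is, every point lying between $c$ and a point of $S$ in the Hamming metric also belongs to $S$.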

    Abstract. We propose a solution to a fundamental problem in neural nets: "Given an arbitrary set of fundamental memories to be stored, does there exist a recursive network for which these fundamental memories are stable equilibrium states of the network?" The heart of it is the conception of the emergent set, a Hamming star-convexity packing in the n-cube, the mathematical framework of Hebb's strengthened learning rule, and the CAM algorithm. We prove that the set of stable equilibrium states of the threshold network constructed by Hebb's strengthened learning rule, which responds to incoming signals of the states of the fundamental
    memories, is the 01-span of the emergence of the fundamental memories. On this basis, we reduce the question to the problem of constructing a threshold network with sparse connections that responds to incoming signals of the states of a generator of the fundamental memories, and thereby
    probing the collective dynamics of the network. One of the great intellectual challenges is to find the mechanism for the storage of memory. The solution of the Content-Addressable Memory Problem indicates such a mechanism: a network produced in the brain, taking the kernel of the received stored memory items as incoming signals, can correctly yield the entire memory items on the basis of sufficient partial information through chaotic dynamics with a regular strategy-set.
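
    As a concrete illustration of the kind of content-addressable recall the abstract describes, here is a minimal sketch that stores binary patterns with the classical Hebbian outer-product rule and retrieves them by asynchronous threshold updates until a stable equilibrium state is reached. It uses the standard Hopfield-style construction with ±1 states, not the thesis's strengthened learning rule or its 0/1 coding; the function names and the toy patterns are illustrative assumptions.

import numpy as np

def hebbian_weights(patterns):
    """Classical Hebbian outer-product rule with zero self-connections."""
    _, n = patterns.shape
    W = patterns.T @ patterns / n          # sum of outer products, scaled by n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, probe, max_sweeps=100, seed=0):
    """Asynchronous threshold updates until no unit wants to flip."""
    rng = np.random.default_rng(seed)
    state = probe.astype(float).copy()
    n = state.size
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(n):       # update units one at a time
            new_value = 1.0 if W[i] @ state >= 0 else -1.0
            if new_value != state[i]:
                state[i] = new_value
                changed = True
        if not changed:                    # stable equilibrium state reached
            break
    return state.astype(int)

# Store two fundamental memories and recover one from partial information.
memories = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1]], dtype=float)
W = hebbian_weights(memories)
probe = memories[0].copy()
probe[0] = -probe[0]                       # corrupt one coordinate of the first memory
print(recall(W, probe))                    # ideally recovers the first memory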

    1. Introduction ... 2
    2. The emergent set ... 3
    3. The Hamming star-convexity packing problem ... 7
    4. The mathematical framework of Hebb's strengthened learning rule ... 12
    5. The content-addressable memory algorithm ... 24
    6. The threshold X-network as pattern of recognition ... 33
    7. Comparison between the Hopfield network and the threshold X-network ... 37
    8. Concluding remarks ... 40
    References ... 41

    [1] E. K. Blum, Mathematical aspects of outer-product asynchronous content-addressable memories, Biological Cybernetics, 62 (1990), 337-348.
    [2] M. Cottrell, Stability and attractivity in associative memory networks, Biological Cybernetics, 58 (1988), 129-139.
    [3] E. Goles, Comportement dynamique de réseaux d'automates, Dissertation, Grenoble, 1985.
    [4] E. Goles and S. Martinez, Neural and Automata Networks, Dynamical Behavior and Applications, Kluwer Academic Publishers, Dordrecht, 1991.
    [5] M. Grötschel, L. Lovász, and A. Schrijver, Geometric Algorithms and Combinatorial Optimization, Springer-Verlag, Berlin-Heidelberg-New York, 1998.
    [6] S. Haykin, Neural Networks, Prentice-Hall, Inc., 1999.
    [7] D. O. Hebb, The Organization of Behavior : A Neuropsychological Theory, John Wiley & Sons, Inc., New York, 1949.
    [8] J. Hertz, A. Krogh, and R. G. Palmer, Introduction to the Theory of Neural Computation, Santa Fe Institute, Studies in the Sciences of Complexity, Lecture Notes Vol. I, Addison-Wesley Publishing Company, New York, 1991.
    [9] A. L. Hodgkin and A. F. Huxley, A quantitative description of membrane current and its application to conduction and excitation in nerve, Journal of Physiology, 117 (1952), 500-544.
    [10] J. J. Hopfield, Neural networks and physical systems with emergent collective computational abilities, Proc. Natl. Acad. Sci. USA, 79 (1982), 2554-2558.
    [11] J. J. Hopfield and D. W. Tank, 'Neural' computation of decisions in optimization problems, Biological Cybernetics, 52 (1985), 141-152.
    [12] Y. Kamp and M. Hasler, Recursive Neural Networks for Associative Memory, John Wiley & Sons, Inc., New York, 1990.
    [13] W. S. McCulloch and W. H. Pitts, A logical calculus of the ideas immanent in nervous activity, Bulletin of Mathematical Biophysics, 5 (1943), 115-133.
    [14] P. Peretto, Properties of neural networks, a statistical physics approach, Biological Cybernetics, 50 (1984), 31-62.
    [15] F. Robert, Les Systèmes Dynamiques Discrets, Mathématiques & Applications, Springer-Verlag, Berlin-Heidelberg-New York, 1995.
    [16] F. F. Soulie, G. Weisbuch, Random iterations of threshold network and associative memory, SIAM J. Computing, 16 (1987), 203-220.
    [17] G. S. Stent, A physiological mechanism for Hebb's postulate of learning, Proc. Natl. Acad. Sci. USA, 70 (1973), 997-1001.
    [18] G. Weisbuch, Complex Systems Dynamics: An Introduction to Automata Networks, Santa Fe Institute, Studies in the Sciences of Complexity, Lecture Notes Vol. II, Addison-Wesley Publishing Company, New York, 1991.
    [19] C. N. Yang, Introductory note on phase transitions and critical phenomena, in Phase Transitions and Critical Phenomena, Vol. I, C. Domb and M. S. Green, eds., Academic Press, London, 1971, 1-5.
