Recurrence Methods in the Analysis of Learning Processes

Article Properties
  • Language
    English
  • Publication Date
    2001/08/01
  • Indian UGC (Journal)
  • References
    3
  • S. Mendelson Department of Mathematics, Technion, and Institute of Computer Science, Hebrew University, Jerusalem 91120, Israel
  • I. Nelken Department of Physiology, Hebrew University-Hadassah Medical School, and the Interdisciplinary Center for Neural Computation, Hebrew University, Jerusalem 91120, Israel
Cite
MLA: Mendelson, S., and I. Nelken. “Recurrence Methods in the Analysis of Learning Processes”. Neural Computation, vol. 13, no. 8, 2001, pp. 1839-61, https://doi.org/10.1162/08997660152469378.
APA: Mendelson, S., & Nelken, I. (2001). Recurrence Methods in the Analysis of Learning Processes. Neural Computation, 13(8), 1839-1861. https://doi.org/10.1162/08997660152469378
Vancouver: Mendelson S, Nelken I. Recurrence Methods in the Analysis of Learning Processes. Neural Computation. 2001;13(8):1839-61.
Journal Categories
  • Medicine → Internal medicine → Neurosciences. Biological psychiatry. Neuropsychiatry
  • Science → Mathematics → Instruments and machines → Electronic computers. Computer science
  • Technology → Electrical engineering. Electronics. Nuclear engineering → Electronics
  • Technology → Mechanical engineering and machinery
Description

How can we ensure that a machine consistently reaches correct states while it learns? This paper presents a condition guaranteeing that a learning process visits its target set infinitely often, almost surely. The condition is easy to verify and holds for many well-known learning rules. To demonstrate the utility of the method, the authors apply it to four types of learning processes: the perceptron, learning rules governed by continuous energy functions, the Kohonen rule, and the committee machine.
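The key claim is that the process enters the target set infinitely often with probability 1. This can be checked empirically for the simplest of the four examples, the perceptron. The Python sketch below is illustrative only and does not reproduce the paper's condition or proof: it runs an online perceptron on a linearly separable toy problem with randomly presented samples and counts how often the weight vector lies in a natural target set, the set of weights that classify every training sample correctly.

    import numpy as np

    rng = np.random.default_rng(0)

    # Linearly separable toy data: labels come from a hidden teacher vector,
    # so a perfect separator (a nonempty target set) is guaranteed to exist.
    n, d = 200, 5
    X = rng.normal(size=(n, d))
    y = np.sign(X @ rng.normal(size=d))

    def in_target_set(w):
        # Target set used in this sketch: weights with zero training error.
        return bool(np.all(np.sign(X @ w) == y))

    # Online perceptron with random sample presentation.
    w = np.zeros(d)
    steps, visits = 50_000, 0
    for _ in range(steps):
        i = rng.integers(n)
        if np.sign(X[i] @ w) != y[i]:  # mistake-driven update
            w += y[i] * X[i]
        visits += in_target_set(w)

    print(f"steps spent in the target set: {visits} / {steps}")

For the perceptron, the classical convergence theorem gives something stronger than recurrence: after finitely many mistakes the process enters the target set and stays there. The same counting experiment, however, applies unchanged to rules that merely revisit the target set, such as stochastic updates driven by persistent noise.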

Published in Neural Computation, this article aligns with the journal’s focus on theoretical and computational aspects of neural networks and learning systems. By providing a verifiable condition for ensuring that learning processes reach correct states, this research contributes to the ongoing development of more robust machine learning algorithms.
