Predictability, Complexity, and Learning

Article Properties
  • Language
    English
  • Publication Date
    2001/11/01
  • Indian UGC (Journal)
  • References
    69
  • Citations
    246
  • William Bialek NEC Research Institute, Princeton, NJ 08540, U.S.A.
  • Ilya Nemenman NEC Research Institute, Princeton, NJ 08540, U.S.A., and Department of Physics, Princeton University, Princeton, NJ 08544, U.S.A.
  • Naftali Tishby NEC Research Institute, Princeton, NJ 08540, U.S.A., and School of Computer Science and Engineering and Center for Neural Computation, Hebrew University, Jerusalem 91904, Israel
Cite
Bialek, William, et al. “Predictability, Complexity, and Learning”. Neural Computation, vol. 13, no. 11, 2001, pp. 2409-63, https://doi.org/10.1162/089976601753195969.
Bialek, W., Nemenman, I., & Tishby, N. (2001). Predictability, Complexity, and Learning. Neural Computation, 13(11), 2409-2463. https://doi.org/10.1162/089976601753195969
Bialek W, Nemenman I, Tishby N. Predictability, Complexity, and Learning. Neural Computation. 2001;13(11):2409-63.
Journal Categories
Medicine
Internal medicine
Neurosciences
Biological psychiatry
Neuropsychiatry
Science
Mathematics
Instruments and machines
Electronic computers
Computer science
Technology
Electrical engineering
Electronics
Nuclear engineering
Electronics
Technology
Mechanical engineering and machinery
Description

Can we quantify the complexity of learning? This paper introduces predictive information, Ipred(T), defined as the mutual information between the past and future of a time series, as a measure of the complexity of the underlying process and of the learning problems it poses. In the limit of large observation times, Ipred(T) exhibits one of three distinct behaviors: it remains finite, grows logarithmically, or grows as a fractional power of T. Logarithmic growth occurs when learning a model with a finite number of parameters, with the coefficient counting the dimensionality of the model space. Power-law growth, conversely, is linked to learning infinite-parameter models, such as continuous functions with smoothness constraints. The research connects predictive information to complexity measures defined in learning theory and statistical mechanics. This framework could be applied to a variety of problems in physics, statistics, and biology, helping to quantify the complexity of the underlying dynamics in diverse systems. The authors argue that the divergent part of Ipred(T) provides an appropriate measure of the complexity of a time series.
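
To make the definition concrete, Ipred(T) can be restated as a standard mutual-information decomposition: writing S(T) for the entropy of segments of duration T, Ipred(T) is the large-future limit of S(T) + S(T') - S(T + T'), which stays well defined as T' grows because the extensive parts of the entropies cancel. The sketch below is not from the paper; the binary Markov chain, its flip probabilities, and the finite future window T_future are illustrative assumptions. It uses plug-in block entropies to estimate this quantity for a synthetic series, for which the predictive information saturates at a finite value, the first of the three regimes described above.

# A minimal sketch (not from the paper): plug-in estimate of the past-future
# mutual information Ipred for a synthetic binary Markov chain. For a
# first-order Markov source the predictive information saturates at a finite
# value, the first of the three regimes described above.
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)

# Two-state Markov chain with assumed (illustrative) flip probabilities.
p_flip = {0: 0.1, 1: 0.3}
N = 100_000
x = np.empty(N, dtype=int)
x[0] = 0
for t in range(1, N):
    x[t] = x[t - 1] ^ int(rng.random() < p_flip[int(x[t - 1])])

def block_entropy(seq, L):
    """Plug-in (maximum-likelihood) entropy, in bits, of length-L blocks."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return float(-np.sum(p * np.log2(p)))

T_future = 5  # finite stand-in for the infinite future
for T in range(1, 6):
    ipred = block_entropy(x, T) + block_entropy(x, T_future) - block_entropy(x, T + T_future)
    print(f"T={T}:  Ipred ~ {ipred:.3f} bits")

As the past window T grows, the estimate stops increasing once T exceeds the source's memory, illustrating the finite regime; for sources with long-range structure, the same estimator would instead show the logarithmic or power-law growth discussed above.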

Published in Neural Computation, the paper fits the journal's scope by addressing computational aspects of learning and information theory. By connecting predictive information with established complexity measures, it reflects the interdisciplinary character of neural computation, drawing on mathematics, physics, and computer science, and it speaks directly to the journal's coverage of computer science, technology, and neural networks.

References
Citations
Citations Analysis
The first work to cite this article, titled Complexity through nonextensivity, was published in 2001. The most recent citation comes from a 2024 study titled Complexity through nonextensivity. The article reached its peak of 19 citations in 2017. It has been cited in 116 different journals, 19% of which are open access. Among related journals, Entropy cited this research the most, with 23 citations. The chart below illustrates the annual citation trends for this article.
Citations of this article by year