Evolving Neural Networks through Augmenting Topologies

Article Properties
  • Language
    English
  • Publication Date
    2002/06/01
  • Indian UGC (Journal)
  • References
    19
  • Citations
    893
  • Kenneth O. Stanley, Department of Computer Sciences, The University of Texas at Austin, Austin, TX 78712, USA
  • Risto Miikkulainen, Department of Computer Sciences, The University of Texas at Austin, Austin, TX 78712, USA
Cite
Stanley, Kenneth O., and Risto Miikkulainen. “Evolving Neural Networks through Augmenting Topologies”. Evolutionary Computation, vol. 10, no. 2, 2002, pp. 99-127, https://doi.org/10.1162/106365602320169811.
Stanley, K. O., & Miikkulainen, R. (2002). Evolving Neural Networks through Augmenting Topologies. Evolutionary Computation, 10(2), 99-127. https://doi.org/10.1162/106365602320169811
Stanley KO, Miikkulainen R. Evolving Neural Networks through Augmenting Topologies. Evolutionary Computation. 2002;10(2):99-127.
Journal Categories
Science
Mathematics
Instruments and machines
Electronic computers
Computer science
Computer software
Technology
Electrical engineering
Electronics
Nuclear engineering
Computer engineering
Computer hardware
Description

Can evolving neural network topologies lead to significant performance gains? This paper presents a method, NeuroEvolution of Augmenting Topologies (NEAT), that outperforms fixed-topology methods on challenging reinforcement learning tasks. NEAT gains its efficiency from three components: a principled method of crossover between different topologies, protection of structural innovation through speciation, and incremental growth from minimal structure. A series of ablation studies confirms that each component is necessary to the system as a whole and that the components depend on one another, together yielding significantly faster learning. NEAT also opens up the possibility of evolving increasingly complex solutions over generations, strengthening the analogy with biological evolution. The method is a significant contribution to genetic algorithms (GAs), showing how evolution can simultaneously optimize and complexify solutions.
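As a sketch of the speciation component mentioned above, the example below shows how NEAT-style genomes carry historical innovation markings on their connection genes and how a compatibility distance of the form δ = c1·E/N + c2·D/N + c3·W̄ (E excess genes, D disjoint genes, W̄ mean weight difference of matching genes, N size of the larger genome) can group genomes into species. This is an illustrative reconstruction, not the authors' code; the class layout and coefficient values are assumptions.

```python
# Minimal NEAT-style sketch (illustrative; not the paper's implementation).
from dataclasses import dataclass

@dataclass
class ConnectionGene:
    in_node: int
    out_node: int
    weight: float
    enabled: bool
    innovation: int  # historical marking assigned when the gene first appeared

def compatibility(genome_a, genome_b, c1=1.0, c2=1.0, c3=0.4):
    """Compatibility distance delta = c1*E/N + c2*D/N + c3*W_bar.

    E = excess genes, D = disjoint genes, W_bar = mean weight difference
    of matching genes, N = number of genes in the larger genome.
    Coefficient values here are placeholders.
    """
    genes_a = {g.innovation: g for g in genome_a}
    genes_b = {g.innovation: g for g in genome_b}
    cutoff = min(max(genes_a), max(genes_b))  # beyond this, unmatched genes are "excess"
    matching, weight_diff, disjoint, excess = 0, 0.0, 0, 0
    for innov in set(genes_a) | set(genes_b):
        if innov in genes_a and innov in genes_b:
            matching += 1
            weight_diff += abs(genes_a[innov].weight - genes_b[innov].weight)
        elif innov > cutoff:
            excess += 1
        else:
            disjoint += 1
    n = max(len(genes_a), len(genes_b))
    w_bar = weight_diff / matching if matching else 0.0
    return c1 * excess / n + c2 * disjoint / n + c3 * w_bar

# Two small genomes sharing early innovations but diverging later:
a = [ConnectionGene(0, 2, 0.5, True, 1), ConnectionGene(1, 2, -0.3, True, 2)]
b = [ConnectionGene(0, 2, 0.7, True, 1), ConnectionGene(1, 3, 0.2, True, 3),
     ConnectionGene(3, 2, 0.9, True, 4)]
print(compatibility(a, b))
```

In NEAT, a genome joins a species when its compatibility distance to that species' representative falls below a threshold, and fitness sharing within each species protects new structures long enough for their weights to be optimized.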

Published in Evolutionary Computation, this paper fits the journal's focus on advancing computational methods inspired by biological evolution. The NEAT method contributes to the field by demonstrating how evolving neural network topologies can enhance learning efficiency, aligning with the journal's core interests.

References
Citations
Citations Analysis
The first research to cite this article was titled "Speeding Up Backpropagation Using Multiobjective Evolutionary Algorithms" and was published in 2003. The most recent citation comes from a 2024 study titled "Speeding Up Backpropagation Using Multiobjective Evolutionary Algorithms". This article reached its peak citation count in 2023, with 123 citations. It has been cited in 333 different journals, 18% of which are open access. Among related journals, IEEE Transactions on Evolutionary Computation cited this research the most, with 37 citations. The chart below illustrates the annual citation trends for this article.
Citations of this article by year