Random Embedding Machines for Pattern Recognition

Article Properties
  • Language
    English
  • Publication Date
    2001/11/01
  • Indian UGC (Journal)
  • References
    10
  • Yoram Baram, Department of Computer Science, Technion, Israel Institute of Technology, Haifa 32000, Israel
Cite
Baram, Yoram. “Random Embedding Machines for Pattern Recognition”. Neural Computation, vol. 13, no. 11, 2001, pp. 2533-48, https://doi.org/10.1162/089976601753196012.
Baram, Y. (2001). Random Embedding Machines for Pattern Recognition. Neural Computation, 13(11), 2533-2548. https://doi.org/10.1162/089976601753196012
Baram Y. Random Embedding Machines for Pattern Recognition. Neural Computation. 2001;13(11):2533-48.
Journal Categories
Medicine
Internal medicine
Neurosciences
Biological psychiatry
Neuropsychiatry
Science
Mathematics
Instruments and machines
Electronic computers
Computer science
Technology
Electrical engineering
Electronics
Nuclear engineering
Electronics
Technology
Mechanical engineering and machinery
Description

Can pattern recognition be simplified by random embeddings? This research introduces random embedding machines, which address real classification problems involving structured data. It is shown that, given sufficient local clustering, a set of points of a given class becomes linearly separable from the other classes, with high probability, when the points are embedded in binary space by a set of randomly parameterized surfaces; such a data set is called a local relative cluster. The required size of the embedding set is inversely proportional to the square of the local clustering degree. The method performs as well as state-of-the-art methods at a fraction of the computational cost. By simplifying the learning problem, this research has implications for machine learning: it addresses long-standing questions about the complexity of random embedding and is relevant to the development of efficient pattern recognition systems.
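
The construction described above can be illustrated with a minimal sketch. Here random hyperplanes stand in for the randomly parameterized surfaces, and a plain perceptron checks linear separability in the binary embedding space; the toy data, the embedding dimension, and the perceptron are illustrative assumptions, not the paper's exact construction.

```python
# Hypothetical sketch of a random-embedding classifier: random hyperplanes
# play the role of the randomly parameterized surfaces, and a perceptron
# tests linear separability in the binary embedding space.
import numpy as np

rng = np.random.default_rng(0)

def random_embed(X, W, b):
    """Map points to {0, 1}^m via m random hyperplanes (rows of W, offsets b)."""
    return (X @ W.T + b > 0).astype(float)

def train_perceptron(Z, y, epochs=100, lr=1.0):
    """Plain perceptron on the embedded (binary) representation."""
    w = np.zeros(Z.shape[1])
    bias = 0.0
    for _ in range(epochs):
        errors = 0
        for z, t in zip(Z, y):           # t in {-1, +1}
            if t * (z @ w + bias) <= 0:  # misclassified point
                w += lr * t * z
                bias += lr * t
                errors += 1
        if errors == 0:                  # linearly separable in embedding space
            break
    return w, bias

# Toy two-class data in the plane (illustrative only).
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])

m = 40                                   # embedding dimension (number of random surfaces)
W = rng.normal(size=(m, X.shape[1]))     # random hyperplane normals
b = rng.normal(size=m)                   # random offsets

Z = random_embed(X, W, b)
w, bias = train_perceptron(Z, y)
preds = np.sign(Z @ w + bias)
print("training accuracy:", (preds == y).mean())
```

In a setting closer to the paper, the number of random surfaces m would not be fixed arbitrarily but chosen in line with the stated result, growing inversely with the square of the local clustering degree of the class being separated.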

This study of pattern recognition using random embedding machines fits the scope of Neural Computation, with its focus on neural networks and machine learning. Its approach to simplifying classification problems and reducing computational complexity contributes to the journal's interest in efficient algorithms for neural computation.
