This paper introduces generalized discriminant analysis (GDA), a method that uses a kernel function operator to perform non-linear discriminant analysis. The approach maps input vectors into a high-dimensional feature space; in that transformed space, linear properties allow classical linear discriminant analysis (LDA) to be extended to non-linear discriminant analysis. The formulation reduces to solving an eigenvalue problem. The study presents classification results on both simulated and real data, demonstrating the effectiveness of GDA, and the work may advance pattern recognition and machine learning. Further research could apply GDA to a wider range of complex datasets, compare its performance with other non-linear classification techniques, and explore different kernel functions.
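The kernel-and-eigenvalue formulation summarized above can be illustrated with a minimal sketch. This is not the paper's exact algorithm: the function names (`rbf_kernel`, `gda_fit`), the ridge regularization term, the kernel centering step, and the toy data are all assumptions of this illustration.

```python
import numpy as np
from scipy.linalg import eigh


def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel from pairwise squared Euclidean distances.
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * d2)


def gda_fit(X, y, gamma=1.0, n_components=1, reg=1e-8):
    """Sketch of kernel discriminant analysis as a generalized eigenproblem."""
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    # Center the kernel matrix (centering in feature space).
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    # Block class-membership matrix: W[i, j] = 1/n_c if i and j share class c.
    W = np.zeros((n, n))
    for c in np.unique(y):
        idx = (y == c)
        W[np.ix_(idx, idx)] = 1.0 / idx.sum()
    # Maximize a^T Kc W Kc a subject to a^T (Kc Kc + reg I) a = 1,
    # i.e. solve the generalized eigenproblem A a = lambda B a.
    A = Kc @ W @ Kc
    B = Kc @ Kc + reg * np.eye(n)  # ridge term keeps B positive definite
    vals, vecs = eigh(A, B)       # eigenvalues in ascending order
    alpha = vecs[:, ::-1][:, :n_components]  # top eigenvectors
    return alpha, Kc


# Toy demo (assumed data): two well-separated Gaussian clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (30, 2)),
               rng.normal(2.0, 0.3, (30, 2))])
y = np.array([0] * 30 + [1] * 30)
alpha, Kc = gda_fit(X, y, gamma=0.5)
Z = Kc @ alpha  # one-dimensional discriminant scores for the training points
gap = abs(Z[y == 0].mean() - Z[y == 1].mean())
```

For well-separated classes, the projection onto the leading eigenvector places the two class means at clearly different scores, which is the non-linear analogue of LDA's between-class separation.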
As a theoretical exploration of a new data-classification method, this paper aligns with Neural Computation's focus on computational and mathematical approaches to understanding neural systems; the journal's scope spans machine learning and the theory of neural computation. The paper's use of kernel function operators, eigenvalue problems, and simulations demonstrates exactly these computational techniques.