Abstract: Quadratic discriminant analysis (QDA) is a widely used method for classification problems and is often preferable to linear discriminant analysis (LDA) for heterogeneous data. However, QDA loses its effectiveness in high-dimensional settings, where the data dimension and sample size both tend to infinity. To address this issue, we propose a novel QDA method that combines spectral correction and regularization techniques, termed SR-QDA. The regularization parameters in our method are selected by maximizing the Fisher discriminant ratio. We compare SR-QDA with QDA, regularized quadratic discriminant analysis (R-QDA), and several other competitors. The results indicate that SR-QDA performs exceptionally well, especially in moderate- and high-dimensional settings. Empirical experiments across diverse datasets further support this conclusion.
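To make the two ingredients named in the abstract concrete, the following is a minimal, illustrative sketch only, not the authors' SR-QDA: each class covariance is regularized by ridge-type shrinkage, Sigma_k(gamma) = (1 - gamma) * S_k + gamma * I, and gamma is chosen by maximizing a Fisher-type discriminant ratio. The ratio used here is a simplified two-class, pooled-covariance proxy; the paper's criterion and its spectral-correction step are more involved.

```python
# Hedged sketch of regularized QDA with a Fisher-ratio-based choice of gamma.
# This is NOT the paper's SR-QDA; the Fisher-type ratio below is a simplified
# two-class, pooled-covariance stand-in used only for illustration.
import numpy as np

def fisher_ratio(X, y, gamma):
    """Two-class Fisher-type ratio (delta' H delta)^2 / (delta' H S H delta)."""
    c0, c1 = np.unique(y)
    X0, X1 = X[y == c0], X[y == c1]
    delta = X1.mean(axis=0) - X0.mean(axis=0)
    S = ((len(X0) - 1) * np.cov(X0, rowvar=False)
         + (len(X1) - 1) * np.cov(X1, rowvar=False)) / (len(X) - 2)
    H = np.linalg.inv((1 - gamma) * S + gamma * np.eye(X.shape[1]))
    return (delta @ H @ delta) ** 2 / (delta @ H @ S @ H @ delta)

def fit_rqda(X, y, gammas=np.linspace(0.05, 0.95, 19)):
    """Pick gamma on a grid by the Fisher-type ratio, then fit regularized QDA."""
    gamma = max(gammas, key=lambda g: fisher_ratio(X, y, g))
    d, model = X.shape[1], {"gamma": gamma, "classes": {}}
    for c in np.unique(y):
        Xc = X[y == c]
        Sg = (1 - gamma) * np.cov(Xc, rowvar=False) + gamma * np.eye(d)
        model["classes"][c] = {
            "mean": Xc.mean(axis=0),
            "prec": np.linalg.inv(Sg),
            "logdet": np.linalg.slogdet(Sg)[1],
            "logprior": np.log(len(Xc) / len(X)),
        }
    return model

def predict_rqda(model, X):
    """Assign each row to the class with the largest quadratic discriminant score."""
    labels, scores = [], []
    for c, p in model["classes"].items():
        diff = X - p["mean"]
        q = -0.5 * np.einsum("ij,jk,ik->i", diff, p["prec"], diff)
        scores.append(q - 0.5 * p["logdet"] + p["logprior"])
        labels.append(c)
    return np.asarray(labels)[np.argmax(np.vstack(scores), axis=0)]
```

The shrinkage target (the identity matrix) and the grid search over gamma are assumptions made for the sketch; they simply make explicit how a data-driven Fisher-type criterion can select the regularization level before the class-specific quadratic scores are computed.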
Abstract: In this paper, we propose an improved linear discriminant analysis, called spectrally-corrected and regularized linear discriminant analysis (SCRLDA). This method integrates the design ideas of the spectrally-corrected sample covariance matrix and regularized discriminant analysis. SCRLDA is specifically designed for classification problems under the assumption that the covariance matrix follows a spiked model. Analyses of real and simulated data show that the proposed classifier outperforms the classical R-LDA and is as competitive as the KNN and SVM classifiers while requiring lower computational complexity.
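The following is a minimal sketch of the spiked-model idea behind the spectral correction, under the simplifying assumption of a unit-variance bulk; it is not the paper's exact SCRLDA construction and omits the regularization component. The leading ("spiked") sample eigenvalues are de-biased via the standard spiked-model relation, the bulk eigenvalues are reset to the assumed noise level, and the corrected covariance is plugged into an ordinary two-class LDA rule.

```python
# Hedged sketch: spiked-model correction of the sample covariance, then plain LDA.
# Assumptions (not from the paper): bulk spectrum equal to 1, so eigenvalues above
# the Marchenko-Pastur edge (1 + sqrt(c))^2, c = p/n, are treated as spikes and
# de-biased via psi = (lam + 1 - c + sqrt((lam + 1 - c)^2 - 4*lam)) / 2.
import numpy as np

def spectrally_corrected_cov(X):
    n, p = X.shape
    c = p / n
    lam, V = np.linalg.eigh(np.cov(X, rowvar=False))   # ascending eigenvalues
    corrected = np.ones_like(lam)                      # bulk reset to assumed noise level 1
    spikes = lam > (1 + np.sqrt(c)) ** 2               # eigenvalues past the bulk edge
    l = lam[spikes]
    corrected[spikes] = (l + 1 - c + np.sqrt((l + 1 - c) ** 2 - 4 * l)) / 2
    return (V * corrected) @ V.T                       # V diag(corrected) V^T

def fit_sclda(X, y):
    """Two-class LDA with the pooled covariance replaced by its corrected estimate."""
    c0, c1 = np.unique(y)
    X0, X1 = X[y == c0], X[y == c1]
    pooled = np.vstack([X0 - X0.mean(axis=0), X1 - X1.mean(axis=0)])
    H = np.linalg.inv(spectrally_corrected_cov(pooled))
    w = H @ (X1.mean(axis=0) - X0.mean(axis=0))
    b = -0.5 * w @ (X1.mean(axis=0) + X0.mean(axis=0)) + np.log(len(X1) / len(X0))
    return w, b, (c0, c1)

def predict_sclda(w, b, classes, X):
    return np.where(X @ w + b > 0, classes[1], classes[0])
```

The spike threshold and the de-biasing formula above are the standard identity-bulk versions; any departure from that bulk assumption, as well as how the correction is combined with regularization in SCRLDA, follows the paper rather than this sketch.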