In this paper, Accelerated Kernel Discriminant Analysis (AKDA) and Accelerated Kernel Subclass Discriminant Analysis (AKSDA) are proposed, based on a novel matrix factorization and simultaneous reduction to diagonal form approach (in short, the simultaneous reduction approach). Specifically, instead of performing the simultaneous reduction of the between- and within-class (or between- and within-subclass) scatter matrices directly, the nonzero eigenpairs (NZEP) of the so-called core matrix, which is of relatively small dimensionality, and the Cholesky factorization of the kernel matrix are computed, achieving a speed-up of more than one order of magnitude over kernel discriminant analysis (KDA). Moreover, because they consist of only a few elementary matrix operations and numerically very stable algorithms, AKDA and AKSDA also offer improved classification accuracy. Experimental evaluation on various datasets confirms that the proposed approaches provide state-of-the-art performance in terms of both training time and classification accuracy.
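To make the computational pattern concrete, the sketch below illustrates an AKDA-style training routine in NumPy/SciPy. It is a minimal illustration under stated assumptions, not the paper's exact algorithm: the core matrix is taken here to be the small C x C Gram matrix of centered, normalized class-indicator vectors (so its NZEP lift to the discriminant targets), the kernel matrix is regularized so that its Cholesky factorization exists, and the names `akda_like_fit` and `akda_like_transform` are hypothetical.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve


def rbf_kernel(X, Y, gamma=1.0):
    """RBF kernel matrix between the rows of X and Y (illustrative choice)."""
    sq = (X ** 2).sum(1)[:, None] + (Y ** 2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(sq, 0.0))


def akda_like_fit(X, y, gamma=1.0, reg=1e-6):
    """AKDA-style training sketch: NZEP of a small core matrix plus a
    single Cholesky factorization of the (regularized) kernel matrix.
    Schematic stand-in for the paper's method, not its exact formulation."""
    n = X.shape[0]
    _, idx = np.unique(y, return_inverse=True)
    counts = np.bincount(idx)
    C = counts.size

    # Normalized class-indicator matrix E (n x C); column c is the
    # indicator of class c scaled by 1 / sqrt(n_c).
    E = np.zeros((n, C))
    E[np.arange(n), idx] = 1.0 / np.sqrt(counts[idx])
    Ec = E - E.mean(axis=0)  # center each column

    # Core matrix: only C x C, so its eigendecomposition is cheap. Its
    # nonzero eigenpairs (NZEP) lift to n-dimensional target vectors;
    # the single zero eigenvalue (rank deficiency) is filtered out.
    core = Ec.T @ Ec
    evals, evecs = np.linalg.eigh(core)
    keep = evals > 1e-10
    targets = Ec @ (evecs[:, keep] / np.sqrt(evals[keep]))  # n x (C-1)

    # One Cholesky factorization of the regularized kernel matrix; the
    # triangular solves replace a costly generalized eigenproblem on two
    # n x n scatter matrices.
    K = rbf_kernel(X, X, gamma)
    factor = cho_factor(K + reg * np.eye(n))
    A = cho_solve(factor, targets)  # expansion coefficients, n x (C-1)
    return A


def akda_like_transform(A, X_train, gamma, X_new):
    """Project new samples into the learned discriminant subspace."""
    return rbf_kernel(X_new, X_train, gamma) @ A
```

The design point mirrored here is that the dominant cost is a single Cholesky factorization followed by triangular solves, rather than the simultaneous reduction of two n x n scatter matrices; the regularizer `reg` (an assumption of this sketch) keeps the kernel matrix positive definite so that the factorization is well defined.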