The brain can detect outliers after being exposed only to normal samples. Similarly, one-class classification (OCC) uses only normal samples to train a model, and the trained model can then be used for outlier detection. In this paper, a multi-layer architecture for OCC is proposed by stacking Graph-Embedded Kernel Ridge Regression (KRR)-based Auto-Encoders in a hierarchical fashion. These Auto-Encoders are formulated under two types of Graph-Embedding, namely, local and global variance-based embedding. The Graph-Embedding captures the relationships among samples, while the stacked Auto-Encoder layers project the input features into a new feature space. The final layer of the proposed architecture is a Graph-Embedded regression-based one-class classifier. The Auto-Encoders are trained in an unsupervised manner, whereas the final layer is trained in a semi-supervised manner (using only positive samples, with a closed-form solution). The proposed method is experimentally evaluated on 21 publicly available benchmark datasets. Experimental results verify the effectiveness of the proposed one-class classifiers over 11 existing state-of-the-art kernel-based one-class classifiers. The Friedman test is also performed to verify the statistical significance of the superiority of the proposed one-class classifiers over the existing state-of-the-art methods. Using the two types of Graph-Embedding, four variants of the Graph-Embedded multi-layer KRR-based one-class classifier are presented in this paper. All four variants perform better than the existing one-class classifiers in terms of the criteria discussed in this paper; hence, the proposed architecture is a viable alternative for the OCC task. In the future, other types of Auto-Encoders can be explored within the proposed architecture.
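To make the layer-wise construction concrete, the following minimal sketch (Python/NumPy, with hypothetical class and parameter names) stacks plain KRR auto-encoders and closes with a KRR one-class layer regressed onto a target vector of ones, each solved in closed form. The local and global variance-based Graph-Embedding regularisers of the proposed method are omitted for brevity, so this illustrates only the general multi-layer pipeline under stated assumptions, not the authors' exact formulation.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel between the rows of A and the rows of B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

class MultiLayerKRROneClass:
    """Hypothetical sketch: stacked KRR auto-encoders + a KRR one-class layer.

    Each auto-encoder layer solves beta = (K + I/C)^{-1} H in closed form and
    feeds the smoothed reconstruction K @ beta to the next layer; the final
    layer regresses the last representation onto a vector of ones.  The
    Graph-Embedding term of the paper is NOT included in this sketch.
    """

    def __init__(self, n_layers=2, C=1.0, gamma=1.0, theta=0.1):
        self.n_layers, self.C, self.gamma, self.theta = n_layers, C, gamma, theta

    def fit(self, X):
        H, self.train_reps_, self.betas_ = X, [], []
        for _ in range(self.n_layers):                       # auto-encoder layers
            K = rbf_kernel(H, H, self.gamma)
            beta = np.linalg.solve(K + np.eye(len(H)) / self.C, H)
            self.train_reps_.append(H)
            self.betas_.append(beta)
            H = K @ beta                                      # projected feature space
        K = rbf_kernel(H, H, self.gamma)                      # one-class layer
        self.H_last_ = H
        self.beta_occ_ = np.linalg.solve(K + np.eye(len(H)) / self.C,
                                         np.ones((len(H), 1)))
        # Threshold so that a fraction theta of training deviations is rejected.
        dev = np.abs(K @ self.beta_occ_ - 1.0).ravel()
        self.threshold_ = np.quantile(dev, 1.0 - self.theta)
        return self

    def predict(self, X):
        H = X
        for H_tr, beta in zip(self.train_reps_, self.betas_):
            H = rbf_kernel(H, H_tr, self.gamma) @ beta        # propagate through layers
        dev = np.abs(rbf_kernel(H, self.H_last_, self.gamma)
                     @ self.beta_occ_ - 1.0).ravel()
        return np.where(dev <= self.threshold_, 1, -1)        # +1 normal, -1 outlier
```

A usage example would train on normal samples only, e.g. `MultiLayerKRROneClass().fit(X_normal).predict(X_test)`; test points whose predicted output deviates from the target value of one by more than the learned threshold are flagged as outliers.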