A numerical approach is developed for detecting the equivalence of deep learning architectures. The method is based on generating Mixed Matrix Ensembles (MMEs) out of deep neural network weight matrices and a {\it conjugate circular ensemble} matching the neural architecture topology. Empirical evidence supports the {\it phenomenon} that the distance between the spectral density of a neural architecture and that of its {\it conjugate circular ensemble} vanishes, with architecture-dependent decay rates, in the long positive tail of the spectrum, i.e., the cumulative Circular Spectral Distance (CSD). This finding can be used to establish equivalences among different neural architectures by analyzing fluctuations in the CSD. We investigated this phenomenon for a wide range of deep learning vision architectures and with circular ensembles originating from statistical quantum mechanics. Practical implications of the proposed method for artificial and natural neural architectures are discussed, such as the possibility of using the approach in Neural Architecture Search (NAS) and in the classification of biological neural networks.
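The comparison described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's actual procedure: it samples a Haar-random unitary as a stand-in for the {\it conjugate circular ensemble} (here the Circular Unitary Ensemble), and the function name \texttt{cumulative\_spectral\_distance}, the normalization to $[0,1]$, and the \texttt{tail\_frac} parameter selecting the upper tail of the spectrum are all illustrative assumptions.

```python
import numpy as np

def haar_unitary(n, rng):
    """Haar-random unitary via QR of a complex Ginibre matrix,
    with the standard phase correction (Mezzadri's method)."""
    z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diagonal(r)
    return q * (d / np.abs(d))

def cumulative_spectral_distance(w, tail_frac=0.2, seed=0):
    """Illustrative tail distance between the spectrum of a square weight
    matrix `w` and one sample from the Circular Unitary Ensemble.
    (Hypothetical stand-in for the paper's cumulative CSD.)"""
    rng = np.random.default_rng(seed)
    n = w.shape[0]
    # Eigenvalue moduli of the weight matrix, sorted ascending.
    ev_w = np.sort(np.abs(np.linalg.eigvals(w)))
    # Eigenvalue phases of a CUE sample, sorted ascending.
    ev_u = np.sort(np.angle(np.linalg.eigvals(haar_unitary(n, rng))))
    # Rescale both spectra to [0, 1] so they are directly comparable.
    ev_w = (ev_w - ev_w.min()) / (np.ptp(ev_w) + 1e-12)
    ev_u = (ev_u - ev_u.min()) / (np.ptp(ev_u) + 1e-12)
    # Average absolute gap over the upper (long positive) tail.
    k = max(1, int(tail_frac * n))
    return float(np.mean(np.abs(ev_w[-k:] - ev_u[-k:])))
```

In this toy setting, tracking the returned value across architectures (or layers) would mimic the fluctuation analysis of the CSD; after rescaling, the result always lies in $[0, 1]$.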