Abstract: Unsupervised anomaly detection has recently come into the spotlight in various practical domains because of the limited amount of anomaly data. One of the major approaches to this task is the normalizing flow, which pursues an invertible transformation of a complex distribution, such as images, into an easy distribution, such as N(0, I). In fact, algorithms based on normalizing flows, such as FastFlow and CFLOW-AD, establish state-of-the-art performance on unsupervised anomaly detection tasks. Nevertheless, we observe that these algorithms convert normal images not into N(0, I), their intended destination, but into an arbitrary normal distribution. Moreover, their performance is often unstable, which is highly critical for unsupervised tasks because no validation data are provided. To address these observations, we propose a simple solution, AltUB, which introduces alternating training to update the base distribution of a normalizing flow for anomaly detection. AltUB effectively improves the stability of the performance of normalizing flows. Furthermore, our method achieves new state-of-the-art performance on the anomaly segmentation task of the MVTec AD dataset with 98.8% AUROC.
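
The alternating-update idea can be illustrated with a minimal sketch (not the authors' implementation): a toy flow is trained by maximum likelihood under a base distribution N(mu, sigma^2) whose parameters are themselves learnable, and the flow parameters and base-distribution parameters are optimized in alternating steps. The toy affine flow, module names, and learning rates below are illustrative assumptions, not the paper's architecture.

```python
# Minimal sketch of alternating training between a flow and a learnable base distribution.
# The single affine layer stands in for a real normalizing flow (e.g. FastFlow-style blocks).
import math
import torch
import torch.nn as nn

class ToyFlow(nn.Module):
    """One learnable affine map z = exp(log_s) * x + t, standing in for a real flow."""
    def __init__(self, dim):
        super().__init__()
        self.log_s = nn.Parameter(torch.zeros(dim))
        self.t = nn.Parameter(torch.zeros(dim))

    def forward(self, x):
        z = x * self.log_s.exp() + self.t
        log_det = self.log_s.sum()  # log|det dz/dx| of the affine map
        return z, log_det

class LearnableBase(nn.Module):
    """Base distribution N(mu, diag(sigma^2)) with trainable mu and log_sigma."""
    def __init__(self, dim):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(dim))
        self.log_sigma = nn.Parameter(torch.zeros(dim))

    def log_prob(self, z):
        var = (2 * self.log_sigma).exp()
        return (-0.5 * ((z - self.mu) ** 2 / var
                        + 2 * self.log_sigma
                        + math.log(2 * math.pi))).sum(dim=-1)

dim = 8
flow, base = ToyFlow(dim), LearnableBase(dim)
opt_flow = torch.optim.Adam(flow.parameters(), lr=1e-3)
opt_base = torch.optim.Adam(base.parameters(), lr=1e-2)  # assumed larger lr for the base

for step in range(200):
    x = torch.randn(64, dim) * 2.0 + 1.0      # stand-in "normal" data batch
    z, log_det = flow(x)
    nll = -(base.log_prob(z) + log_det).mean()  # negative log-likelihood of the batch

    # Alternate: update the flow on even steps, the base distribution on odd steps.
    opt = opt_flow if step % 2 == 0 else opt_base
    opt.zero_grad()
    nll.backward()
    opt.step()
```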