Abstract: Previous research has indicated that deep neural network-based models for time series classification (TSC) tasks are prone to overfitting. This issue can be mitigated by strategies that prevent the model from becoming overly confident in its predictions, such as label smoothing and the confidence penalty. Building on the concept of label smoothing, we propose a novel approach to generating more reliable soft labels, which we call representation soft label smoothing. We apply label smoothing, the confidence penalty, and representation soft label smoothing to several TSC models and compare their performance with a baseline trained only on hard labels. Our results demonstrate that these enhancement techniques yield competitive results relative to the baseline. Importantly, our method performs strongly across models of varying structures and complexities.
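As a concrete point of reference, the sketch below illustrates the two standard regularizers named above, label smoothing and the confidence penalty, in PyTorch. The hyperparameters `epsilon` and `beta` are illustrative assumptions, not values from the paper, and the proposed representation soft label smoothing is not reproduced here since its construction is specific to this work.

```python
import torch
import torch.nn.functional as F

def label_smoothing_loss(logits, targets, epsilon=0.1):
    """Cross-entropy against smoothed targets.

    The one-hot target is mixed with a uniform distribution over all K
    classes: the true class receives (1 - epsilon) + epsilon / K and every
    other class receives epsilon / K.
    """
    num_classes = logits.size(-1)
    log_probs = F.log_softmax(logits, dim=-1)
    smooth = torch.full_like(log_probs, epsilon / num_classes)
    smooth.scatter_(-1, targets.unsqueeze(-1),
                    1.0 - epsilon + epsilon / num_classes)
    return -(smooth * log_probs).sum(dim=-1).mean()

def confidence_penalty_loss(logits, targets, beta=0.1):
    """Cross-entropy minus beta times the entropy of the predictive
    distribution, discouraging low-entropy (overconfident) outputs."""
    log_probs = F.log_softmax(logits, dim=-1)
    nll = F.nll_loss(log_probs, targets)
    entropy = -(log_probs.exp() * log_probs).sum(dim=-1).mean()
    return nll - beta * entropy
```

Either loss can replace standard cross-entropy in a TSC training loop without changing the model architecture, which is why such regularizers transfer easily across models of different structures and complexities.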