This study investigates a new constrained-optimization formulation for deep ordinal classification. We impose uni-modality of the label distribution implicitly via a set of inequality constraints over pairs of adjacent labels. To tackle the ensuing challenging optimization problem, we solve a sequence of unconstrained losses based on a powerful extension of the log-barrier method. This accommodates standard SGD for deep networks and avoids computationally expensive Lagrangian dual steps and projections, while substantially outperforming penalty methods. Our non-parametric model is more flexible than existing deep ordinal classification techniques: it does not restrict the learned representation to a specific parametric model, allowing the training to explore larger spaces of solutions and removing the need for ad hoc choices, while scaling up to large numbers of labels. It can be used in conjunction with any standard classification loss and any deep architecture. We also propose a new performance metric for ordinal classification, referred to as the Sides Order Index (SOI), as a proxy for measuring the uni-modality of a distribution. We report comprehensive evaluations and comparisons to state-of-the-art methods on public benchmark datasets for several ordinal classification tasks, showing the merits of our approach in terms of label consistency and scalability. A public reproducible PyTorch implementation is provided (https://github.com/sbelharbi/Deep-Ordinal-Classification-with-Inequality-Constraints).
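To make the formulation concrete, below is a minimal PyTorch sketch of how such a penalty could look. It assumes softmax posteriors `probs` of shape (batch, K) over K ordered labels and integer targets `labels`; the function names, the encoding of the adjacent-label constraints, and the linear-extension threshold -1/t^2 are illustrative assumptions based on the abstract's description of an extended log-barrier over inequality constraints, not the repository's actual API.

```python
import math
import torch

def extended_log_barrier(z, t):
    # Smooth extension of the log-barrier -(1/t)*log(-z): the standard
    # barrier is used on the strictly feasible side (z <= -1/t^2) and is
    # continued linearly elsewhere, so the penalty stays finite and
    # differentiable even when a constraint is violated (z > 0).
    barrier = -torch.log((-z).clamp(min=1e-12)) / t
    linear = t * z - math.log(1.0 / t ** 2) / t + 1.0 / t
    return torch.where(z <= -1.0 / t ** 2, barrier, linear)

def unimodality_penalty(probs, labels, t):
    # Adjacent-label inequality constraints for a uni-modal posterior
    # peaked at the true label y:
    #   p_k - p_{k+1} <= 0  for k < y  (rising side),
    #   p_{k+1} - p_k <= 0  for k >= y (falling side).
    diffs = probs[:, :-1] - probs[:, 1:]           # (batch, K-1): p_k - p_{k+1}
    k = torch.arange(diffs.size(1), device=probs.device)
    rising = k.unsqueeze(0) < labels.unsqueeze(1)  # k < y
    z = torch.where(rising, diffs, -diffs)         # all constraints as z <= 0
    return extended_log_barrier(z, t).sum(dim=1).mean()
```

In training, a penalty of this form would be added to a standard classification loss such as cross-entropy, with t raised across epochs so that the sequence of unconstrained losses increasingly approximates the hard constraints, all optimized with standard SGD.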