Abstract: We consider a classification method that balances two objectives: high similarity among the samples within the cluster, and high dissimilarity between the cluster and its complement. The method, referred to as HNC or SNC, requires seed nodes, or labeled samples, at least one of which lies in the cluster and at least one in the complement. Other than that, the method relies only on the pairwise relationships between samples. The contribution here is a new method for the setting of noisy labels, based on HNC and called Confidence HNC, in which we introduce confidence weights that allow the given labels of labeled samples to be violated, with a penalty that reflects the perceived correctness of each given label. A violated label is then interpreted as noisy. The method represents the problem as a graph problem with hyperparameters, which is solved very efficiently by the network flow technique of parametric cut. We compare the performance of the new method with leading algorithms on both real and synthetic data with noisy labels and demonstrate that it delivers improved performance in terms of both classification accuracy and noise detection capability.
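
To make the description concrete, the following is an illustrative sketch of a confidence-weighted, HNC-style objective; the notation (similarity weights $w_{ij}$, labeled sets $L^+$ and $L^-$, confidence weights $c_i$, and tradeoff parameter $\lambda$) is assumed here for exposition and is not necessarily the paper's exact formulation:
\[
\min_{S \subseteq V} \;\; \sum_{i \in S,\, j \notin S} w_{ij} \;-\; \lambda \sum_{i \in S,\, j \in S} w_{ij} \;+\; \sum_{i \in L^{+} \setminus S} c_i \;+\; \sum_{i \in L^{-} \cap S} c_i .
\]
The first two terms capture the HNC tradeoff between inter-cluster dissimilarity and intra-cluster similarity, while the last two terms charge a confidence penalty $c_i$ whenever a given label is violated, that is, when a positively labeled sample is excluded from the cluster $S$ or a negatively labeled sample is included in it. Under this kind of formulation, the label-penalty terms can be encoded as source and sink arcs in a cut graph, so the problem remains solvable for all values of $\lambda$ by a parametric minimum cut.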