Abstract: In recent years, a large number of binarization methods have been developed, with varying performance, generalization, and strength across different benchmarks. In this work, to leverage these methods, an ensemble-of-experts (EoE) framework is introduced to efficiently combine the outputs of various methods. The proposed framework offers a new selection process for the binarization methods, which act as the experts in the ensemble, by introducing three concepts: confidentness, endorsement, and schools of experts. The framework, which is highly objective, is built on two general principles: (i) consolidation of saturated opinions and (ii) identification of schools of experts. After the endorsement graph of the ensemble is built for an input document image based on the confidentness of the experts, the saturated opinions are consolidated, and the schools of experts are then identified by thresholding the consolidated endorsement graph. A variation of the framework, in which no selection is made, is also introduced; it combines the outputs of all experts using endorsement-dependent weights. The EoE framework is evaluated on the set of methods that participated in the H-DIBCO'12 contest, and also on an ensemble generated from various instances of the grid-based Sauvola method, with promising performance.
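The abstract only outlines how the endorsement graph and the schools of experts are used, so the Python sketch below is a minimal, hypothetical illustration of that pipeline. The pairwise-agreement measure standing in for confidentness/endorsement, the threshold value, and the majority-vote fusion are illustrative assumptions, not the definitions used in the paper.

```python
import numpy as np

def eoe_combine(binary_maps, endorsement_threshold=0.8):
    """Sketch of an ensemble-of-experts style combination of binarization outputs.

    Assumptions: endorsement between two experts is approximated by the
    pixel-wise agreement of their binary maps, and the dominant school is
    fused by majority vote. These are placeholders for the paper's actual
    confidentness/endorsement definitions.
    """
    n = len(binary_maps)
    maps = [m.astype(bool) for m in binary_maps]

    # Endorsement graph: expert i "endorses" expert j in proportion to
    # how much their binary outputs agree (placeholder measure).
    endorsement = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            endorsement[i, j] = np.mean(maps[i] == maps[j])

    # Identify "schools of experts" by thresholding the endorsement graph
    # and taking connected components of the resulting undirected graph.
    adj = endorsement >= endorsement_threshold
    schools, unvisited = [], set(range(n))
    while unvisited:
        stack, school = [unvisited.pop()], set()
        while stack:
            k = stack.pop()
            school.add(k)
            for m in list(unvisited):
                if adj[k, m] or adj[m, k]:
                    unvisited.remove(m)
                    stack.append(m)
        schools.append(school)

    # Select the largest school and fuse its members by majority vote.
    dominant = max(schools, key=len)
    stacked = np.stack([maps[k] for k in dominant])
    return stacked.mean(axis=0) >= 0.5
```

The no-selection variation mentioned in the abstract could be sketched analogously by skipping the school identification and weighting every expert's output, for example by its row sum in the endorsement matrix; that weighting choice is likewise an assumption for illustration only.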
Abstract: Designing a fast and efficient optimization method that can avoid local optima on a variety of optimization problems is still an open problem for many researchers. In this work, the concept of a new global optimization method with an open implementation area is introduced as Curved Space Optimization (CSO), a simple probabilistic optimization method enhanced by concepts from the general theory of relativity. To address global optimization challenges such as performance and convergence, the new method is designed around the transformation of a random search space into a new search space based on the concept of space-time curvature in general relativity. In order to evaluate the performance of the proposed method, an implementation of CSO is deployed and its results on benchmark functions are compared with those of state-of-the-art optimization methods. The results show that the performance of CSO is promising on unimodal and multimodal benchmark functions with different search-space dimensions.
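Since the abstract describes CSO only at the level of transforming a random search space using curvature concepts, the snippet below is a rough sketch of a probabilistic search in that spirit. The specific "warping" rule (pulling the sampling region toward the best candidate and contracting it) and all constants are assumptions for illustration, not the actual CSO update.

```python
import numpy as np

def cso_like_search(objective, dim, bounds, pop_size=30, iters=200, seed=0):
    """Minimal CSO-flavoured probabilistic search (illustrative only).

    The abstract does not specify the curvature-based transformation, so the
    warping step here (re-centering and shrinking the sampling region around
    the best solution found so far) is a placeholder assumption.
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    center = rng.uniform(lo, hi, dim)
    radius = (hi - lo) / 2.0
    best_x, best_f = center.copy(), objective(center)

    for _ in range(iters):
        # Probabilistic sampling inside the current (warped) region.
        pop = np.clip(center + rng.uniform(-radius, radius, (pop_size, dim)), lo, hi)
        vals = np.apply_along_axis(objective, 1, pop)
        k = int(np.argmin(vals))
        if vals[k] < best_f:
            best_x, best_f = pop[k].copy(), vals[k]
        # "Curve" the space: bias the sampling region toward the best
        # solution and contract it slowly over the iterations.
        center = 0.5 * (center + best_x)
        radius *= 0.98

    return best_x, best_f

# Example on the unimodal sphere benchmark in 10 dimensions.
x, f = cso_like_search(lambda v: float(np.sum(v ** 2)), dim=10, bounds=(-5.0, 5.0))
```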