Abstract: The notion of reparametrizations of Weighted CSPs (WCSPs), also known as equivalence-preserving transformations of WCSPs, is well known and is used in many algorithms to approximate or bound the optimal WCSP value. In contrast, the concept of super-reparametrizations (changes of the weights that keep or increase the WCSP objective for every assignment) has been proposed before but never studied in detail. To fill this gap, we present a number of theoretical properties of super-reparametrizations and compare them to those of reparametrizations. Furthermore, we propose a framework for computing upper bounds on the optimal value of the (maximization version of) WCSP using super-reparametrizations. We show that, under some technical conditions, arbitrary constraint propagation rules can in principle be employed to improve the bound. For arc consistency in particular, the method reduces to the known Virtual AC (VAC) algorithm. As a new contribution, we implemented the method for singleton arc consistency (SAC) and compared it to other strong local consistencies in WCSPs on a public benchmark. The results show that the bounds obtained from SAC are superior for many instance groups.
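The defining property of a super-reparametrization (the objective keeps or increases for every assignment) can be checked by exhaustive enumeration on tiny instances. The following is a minimal sketch under an assumed toy dictionary encoding of a pairwise WCSP; it is only an illustration of the definition, not the bounding framework from the paper.

```python
# Brute-force check that a change of weights is a super-reparametrization of a toy
# pairwise WCSP, i.e. the objective is kept or increased for EVERY assignment.
# The dictionary-based encoding is a hypothetical toy representation.
from itertools import product

def wcsp_value(unary, pairwise, assignment):
    """Objective of an assignment: sum of the selected unary and pairwise weights."""
    val = sum(unary[i][assignment[i]] for i in range(len(unary)))
    val += sum(w[assignment[i], assignment[j]] for (i, j), w in pairwise.items())
    return val

def is_super_reparametrization(orig, new, domains):
    """True iff the new weights give an objective >= the original objective for
    every assignment (checked exhaustively, so only feasible for tiny instances)."""
    (u1, p1), (u2, p2) = orig, new
    for assignment in product(*[range(d) for d in domains]):
        if wcsp_value(u2, p2, assignment) < wcsp_value(u1, p1, assignment) - 1e-9:
            return False
    return True

# Toy instance: 2 variables with 2 values each, one pairwise weight function.
unary = [{0: 1.0, 1: 0.0}, {0: 0.0, 1: 2.0}]
pairwise = {(0, 1): {(0, 0): 0.0, (0, 1): 1.0, (1, 0): 3.0, (1, 1): 0.0}}
# Move 1 unit of weight from value 0 of variable 0 to the pairwise weights of all
# value pairs containing it: every assignment keeps its value, so this is a
# reparametrization and in particular a super-reparametrization.
unary2 = [{0: 0.0, 1: 0.0}, dict(unary[1])]
pairwise2 = {(0, 1): {(0, 0): 1.0, (0, 1): 2.0, (1, 0): 3.0, (1, 1): 0.0}}
print(is_super_reparametrization((unary, pairwise), (unary2, pairwise2), [2, 2]))  # True
```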
Abstract: Coordinate-wise minimization is a simple and popular method for large-scale optimization. Unfortunately, for general (non-differentiable) convex problems it may not find a global minimum. We present a class of linear programs that coordinate-wise minimization solves exactly. We show that dual LP relaxations of several well-known combinatorial optimization problems belong to this class and that the method finds a global minimum with sufficient accuracy in reasonable runtimes. Moreover, for extensions of these problems that are no longer in this class, the method yields reasonably good suboptima. Although the presented LP relaxations can be solved by more efficient methods (such as max-flow), our results are theoretically non-trivial and can lead to new large-scale optimization algorithms in the future.
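To make the iteration scheme concrete, here is a minimal sketch of coordinate-wise minimization where each step sets one coordinate to the exact minimizer of the objective with the other coordinates fixed. The toy objective is a smooth strongly convex quadratic (for which the method provably converges); it only illustrates the update rule, not the LP relaxations studied in the paper.

```python
import numpy as np

def coordinate_minimize(A, b, num_sweeps=100):
    """Exact coordinate-wise minimization of f(x) = 0.5*x'Ax - b'x,
    with A symmetric positive definite (this is Gauss-Seidel iteration)."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(num_sweeps):
        for i in range(n):
            # Exact minimizer of f over x[i] with the remaining coordinates fixed.
            x[i] = (b[i] - A[i, :] @ x + A[i, i] * x[i]) / A[i, i]
    return x

A = np.array([[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
x = coordinate_minimize(A, b)
print(np.allclose(A @ x, b, atol=1e-6))  # True: the global minimum is reached
```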
Abstract: (Block-)coordinate minimization is an iterative optimization method that, in every iteration, finds a global minimum of the objective over a single variable or a subset of variables, while keeping the remaining variables fixed. While for some problems (e.g., with a convex differentiable objective) coordinate minimization converges to a global minimum, for general (non-differentiable) convex problems this need not be the case. Despite this drawback, (block-)coordinate minimization can be an acceptable option for large-scale non-differentiable convex problems; examples are methods for solving the linear programming relaxation of the discrete energy minimization problem (MAP inference in graphical models). When block-coordinate minimization is applied to a general convex problem, in every iteration the minimizer over the current coordinate block need not be unique, and therefore a single minimizer must be chosen. We propose that this minimizer be chosen from the relative interior of the set of all minimizers over the current block. We show that this rule is, in a certain precise sense, not worse than any other rule.
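The proposed rule can be illustrated on a single coordinate update of a toy non-differentiable convex objective whose minimizer set is a whole interval. The sketch below is an assumed minimal example: when the minimizers over the current coordinate form an interval, it returns a point from the relative interior of that interval (here its midpoint) rather than an arbitrary endpoint.

```python
def coordinate_update(c):
    """Exact minimization of g(t) = sum_j |t - c[j]| over t.
    Returns a minimizer chosen from the relative interior of the minimizer set."""
    c = sorted(c)
    n = len(c)
    if n % 2 == 1:
        return c[n // 2]                  # unique minimizer (the median)
    lo, hi = c[n // 2 - 1], c[n // 2]     # the minimizer set is the interval [lo, hi]
    return 0.5 * (lo + hi)                # midpoint lies in its relative interior

print(coordinate_update([0.0, 1.0, 5.0, 6.0]))  # 3.0, not an endpoint of [1, 5]
```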