Abstract: This paper presents a hybrid morphological neural network for regression tasks called linear dilation-erosion regression ($\ell$-DER). In a few words, an $\ell$-DER model is given by a convex combination of compositions of linear and elementary morphological operators. As a result, $\ell$-DER models yield continuous piecewise linear functions and, thus, are universal approximators. Apart from introducing the $\ell$-DER models, we present three approaches for training them: one based on stochastic gradient descent and two based on difference-of-convex programming. Finally, we evaluate the performance of the $\ell$-DER model using 14 regression tasks. Although the approach based on SGD proved faster than the other two, the $\ell$-DER trained using a disciplined convex-concave programming problem outperformed the others, achieving the lowest mean absolute error score.
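For concreteness, the following is a minimal NumPy sketch of how such a model can evaluate an input under the usual assumption that the elementary dilation and erosion are the max-plus and min-plus operators: each branch applies a linear map followed by a dilation or an erosion, and the branch outputs are combined by a convex combination. The function and parameter names (`ell_der_forward`, `W`, `M`, `a`, `b`, `beta`) are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def dilation(z, a):
    """Elementary max-plus dilation: max_j (z_j + a_j)."""
    return np.max(z + a)

def erosion(z, b):
    """Elementary min-plus erosion: min_j (z_j + b_j)."""
    return np.min(z + b)

def ell_der_forward(x, W, a, M, b, beta):
    """Sketch of an ell-DER evaluation: a convex combination (weight beta)
    of a dilation and an erosion, each composed with a linear map."""
    d = dilation(W @ x, a)               # dilation of a linearly transformed input
    e = erosion(M @ x, b)                # erosion of a linearly transformed input
    return beta * d + (1.0 - beta) * e   # convex combination, 0 <= beta <= 1
```

Because the maximum and the minimum of affine functions are piecewise linear, a map of this form is a continuous piecewise linear function of the input, which is the property behind the universal approximation claim.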
Abstract: Mathematical morphology (MM) is a theory of non-linear operators used for the processing and analysis of images. Morphological neural networks (MNNs) are neural networks whose neurons compute morphological operators. Dilations and erosions are the elementary operators of MM. From an algebraic point of view, a dilation and an erosion are operators that commute with the supremum and infimum operations, respectively. In this paper, we present the \textit{linear dilation-erosion perceptron} ($\ell$-DEP), which is obtained by applying linear transformations before computing a dilation and an erosion. The decision function of the $\ell$-DEP model is defined by adding a dilation and an erosion. Furthermore, training an $\ell$-DEP can be formulated as a convex-concave optimization problem. We compare the performance of the $\ell$-DEP model with that of other machine learning techniques on several classification problems. The computational experiments support the potential application of the proposed $\ell$-DEP model to binary classification tasks.
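Under the same assumption that the dilation and erosion are the max-plus and min-plus operators, the decision function described in this abstract can be written schematically as follows, where $R$ and $S$ denote the linear transformations and $\mathbf{a}$, $\mathbf{b}$ play the role of structuring elements; the symbols are illustrative rather than the paper's exact notation:
\[
\tau(\mathbf{x}) \;=\; \delta_{\mathbf{a}}(R\mathbf{x}) + \varepsilon_{\mathbf{b}}(S\mathbf{x})
\;=\; \max_{j}\big(\mathbf{r}_j^{T}\mathbf{x} + a_j\big) \;+\; \min_{k}\big(\mathbf{s}_k^{T}\mathbf{x} + b_k\big),
\]
with an input assigned to the positive class when $\tau(\mathbf{x}) \ge 0$ and to the negative class otherwise. Since the first term is a maximum of affine functions (convex) and the second is a minimum of affine functions (concave), $\tau$ is a difference of convex functions, which is what makes a convex-concave training formulation possible.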
Abstract: In this work, we briefly review the reduced dilation-erosion perceptron (r-DEP) models for binary classification tasks. Then, we present the so-called linear dilation-erosion perceptron ($\ell$-DEP), in which a linear transformation is applied before the morphological operators. Furthermore, we propose to train the $\ell$-DEP classifier by minimizing a regularized hinge-loss function subject to concave-convex constraints. A simple example is given for illustrative purposes.
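As a rough illustration of the kind of training problem meant here (the trade-off parameter $C$, the slack variables $\xi_i$, and the generic regularizer $\Phi$ are assumptions made for this sketch, not the paper's exact formulation), one may minimize a regularized hinge loss over the parameters of the decision function $\tau$ sketched above:
\[
\begin{aligned}
\min_{R,\mathbf{a},S,\mathbf{b},\boldsymbol{\xi}} \quad & \Phi(R,\mathbf{a},S,\mathbf{b}) + C \sum_{i=1}^{m} \xi_i \\
\text{subject to} \quad & y_i\,\tau(\mathbf{x}_i) \ge 1 - \xi_i, \qquad \xi_i \ge 0, \qquad i = 1,\dots,m,
\end{aligned}
\]
where $y_i \in \{-1,+1\}$ are the class labels. Because $\tau$ is the sum of a convex and a concave term, each margin constraint is a concave-convex (difference-of-convex) restriction, so the problem fits the convex-concave programming framework mentioned in these abstracts.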