Abstract: In this paper, we propose a new weight initialization method, called even initialization, for wide and deep nonlinear neural networks with the ReLU activation function. We prove that no poor local minimum exists in the initial loss landscape of a wide and deep nonlinear neural network initialized by the proposed even initialization method. Specifically, in the initial loss landscape of such a wide and deep ReLU neural network, the following four statements hold: 1) the loss function is non-convex and non-concave; 2) every local minimum is a global minimum; 3) every critical point that is not a global minimum is a saddle point; and 4) bad saddle points exist. We also show that the weight values produced by even initialization are contained in those produced by both the commonly used standard initialization and the He initialization methods.
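The abstract does not specify the sampling rule of the proposed even initialization, so it is omitted below. As a point of reference, here is a minimal Python sketch of the two schemes the abstract compares against: the standard initialization (zero-mean Gaussian with variance 1/fan_in) and He initialization (variance 2/fan_in, designed to preserve activation variance through ReLU layers). The function names are our own, chosen for illustration.

```python
# Sketch of the two reference initialization schemes named in the abstract.
# The proposed "even initialization" rule is not given here and is omitted.
import numpy as np

def init_standard(fan_in: int, fan_out: int, rng=None):
    # Standard initialization: zero-mean Gaussian, variance 1/fan_in.
    rng = rng or np.random.default_rng()
    return rng.normal(0.0, np.sqrt(1.0 / fan_in), size=(fan_in, fan_out))

def init_he(fan_in: int, fan_out: int, rng=None):
    # He initialization: variance 2/fan_in, derived so that ReLU layers
    # preserve the variance of forward activations (He et al., 2015).
    rng = rng or np.random.default_rng()
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

W = init_he(256, 128)
print(W.std())  # close to sqrt(2/256) ≈ 0.088
```

Per the abstract's final claim, the weight values drawn by even initialization would fall within the ranges produced by both schemes above.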
Abstract: We survey the development of Clifford's geometric algebra and some of its engineering applications over the last 15 years. Several recently developed applications and their merits are discussed in some detail. We thus hope to clearly demonstrate the benefit of developing problem solutions in a unified framework for algebra and geometry with the widest possible scope: from quantum computing and electromagnetism to satellite navigation, from neural computing to camera geometry, image processing, robotics and beyond.