Abstract: Approximate message passing (AMP) algorithms break a (high-dimensional) statistical problem into parts and then repeatedly solve each part in turn, akin to alternating projections. A distinguishing feature is that their asymptotic behaviours can be accurately predicted via their associated state evolution equations. Orthogonal AMP (OAMP) was recently developed to avoid computing the so-called Onsager term of traditional AMP algorithms, providing two clear benefits: the derivation of an OAMP algorithm is more straightforward, and the approach is more broadly applicable. OAMP was originally demonstrated for statistical problems with a single measurement vector and a single transform. This paper extends OAMP to statistical problems with multiple measurement vectors (MMVs) and multiple transforms (MTs). We name the resulting algorithms OAMP-MMV and OAMP-MT, respectively, and their combination augmented OAMP (A-OAMP). Whereas extending traditional AMP algorithms to such problems would be challenging, the orthogonality principle underpinning OAMP makes these extensions straightforward. The MMV and MT models are widely applicable in signal processing and communications. We present an example of a MIMO relay system with correlated source data and signal clipping, which can be modelled as a joint MMV-MT system. While existing methods encounter difficulties in this example, OAMP offers an efficient solution with excellent performance.
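To make the "solve each part in turn" structure concrete, the sketch below implements a minimal single-measurement-vector OAMP loop for the standard linear model y = Ax + n, alternating between a linear estimator (LE) and a nonlinear estimator (NLE). It is an illustration only, not the paper's OAMP-MMV/MT algorithms: the Bernoulli-Gaussian signal, the LMMSE choice of LE, the soft-threshold NLE, and the use of oracle error variances (which a practical implementation would instead track via state evolution) are all assumptions made for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, sigma = 400, 800, 0.05

# Ground truth: 10%-sparse signal observed through y = A x + noise.
x = rng.standard_normal(n) * (rng.random(n) < 0.1)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x + sigma * rng.standard_normal(m)

def soft(u, t):
    """Soft-threshold denoiser (a stand-in for a prior-matched NLE)."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

s = np.zeros(n)
for it in range(20):
    # Part 1 (LE): LMMSE matrix, rescaled so that tr(W A) = n, the
    # "divergence-free" normalization used in OAMP.
    v2 = max(np.mean((s - x) ** 2), 1e-9)   # oracle variance (demo only)
    Wh = v2 * A.T @ np.linalg.inv(v2 * (A @ A.T) + sigma**2 * np.eye(m))
    W = (n / np.trace(Wh @ A)) * Wh
    r = s + W @ (y - A @ s)

    # Part 2 (NLE): soft threshold, then orthogonalize by subtracting the
    # divergence component so output and input errors are asymptotically
    # uncorrelated.
    tau = np.sqrt(np.mean((r - x) ** 2))    # oracle again (demo only)
    div = np.mean(np.abs(r) > tau)          # empirical divergence of soft()
    s = (soft(r, tau) - div * r) / (1.0 - div)

print("NMSE:", np.sum((s - x) ** 2) / np.sum(x ** 2))
```

A natural way to bring MMV structure into this template is to let the NLE act jointly across measurement vectors (e.g., row-wise on the signal matrix, exploiting joint sparsity); the precise construction used by OAMP-MMV and OAMP-MT is developed in the paper itself.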
Abstract: Approximate message passing (AMP) is an efficient iterative parameter-estimation technique for certain high-dimensional linear systems with non-Gaussian distributions, such as sparse systems. In AMP, a so-called Onsager term is added to keep the estimation errors approximately Gaussian. Orthogonal AMP (OAMP) does not require this Onsager term; instead, it relies on an orthogonalization procedure to keep the current errors uncorrelated with (i.e., orthogonal to) past errors. In this paper, we show that the orthogonality in OAMP ensures that the errors are "asymptotically independently and identically distributed Gaussian" (AIIDG). This AIIDG property, which is essential for the attractive performance of OAMP, holds for separable functions. We present a procedure that realizes the required orthogonality in OAMP through Gram-Schmidt orthogonalization (GSO). We show that expectation propagation (EP), AMP, OAMP and some other algorithms can be unified under this orthogonality framework. The simplicity and generality of OAMP provide efficient solutions for estimation problems beyond the classical linear models; related applications are discussed in a companion paper, where new algorithms are developed for problems with multiple constraints and multiple measurement vectors.
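The decorrelation that GSO provides can be checked numerically. The sketch below is an illustration under assumed settings (a Bernoulli-Gaussian signal, a soft-threshold denoiser, and a threshold equal to the noise level, none of which are fixed by the abstract): it subtracts from the denoiser output its component along the input r, using the empirical divergence as the projection coefficient, and rescales. The output error's correlation with the input error then drops to roughly zero, consistent with the orthogonality property.

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma = 100_000, 0.4

# Sparse signal: Bernoulli-Gaussian, 10% nonzero entries.
x = rng.standard_normal(n) * (rng.random(n) < 0.1)
w = sigma * rng.standard_normal(n)   # i.i.d. Gaussian input error
r = x + w                            # pseudo-observation, as in AMP/OAMP

def soft(u, t):
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

eta = soft(r, sigma)                 # plain separable denoiser
div = np.mean(np.abs(r) > sigma)     # empirical divergence <eta'>

# Gram-Schmidt-style step: remove the component of eta(r) along r,
# then rescale so the signal component keeps unit gain.
eta_orth = (eta - div * r) / (1.0 - div)

print("corr(before):", np.corrcoef(eta - x, w)[0, 1])
print("corr(after): ", np.corrcoef(eta_orth - x, w)[0, 1])
```

The reason this works follows from Stein's lemma: for Gaussian w with variance sigma^2, E[w(eta(x+w) - x)] = sigma^2 E[eta'(x+w)], so subtracting <eta'> r removes exactly the component of the output error aligned with the input error, and dividing by (1 - <eta'>) restores the signal gain.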