Abstract: In this paper, we relate the feedback capacity of parallel additive colored Gaussian noise (ACGN) channels to a variant of the Kalman filter. By doing so, we obtain lower bounds on the feedback capacity of such channels, as well as the corresponding feedback (recursive) coding schemes, which are essentially power allocation policies with feedback, that achieve the bounds. The results are seen to reduce to existing lower bounds in the case of a single ACGN feedback channel, whereas for parallel additive white Gaussian noise (AWGN) channels with feedback, the recursive coding scheme reduces to a feedback "water-filling" power allocation policy.
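The feedback "water-filling" policy mentioned above builds on classical water-filling for parallel AWGN channels without feedback. As assumed background (not the paper's feedback scheme itself), here is a minimal sketch of classical water-filling; the function name and the bisection search for the water level are illustrative choices:

```python
import numpy as np

def water_filling(noise_powers, total_power, tol=1e-12):
    """Classical water-filling over parallel AWGN channels (no feedback):
    allocate P_i = max(0, nu - N_i) so that sum_i P_i = total_power,
    where nu is the common 'water level' found here by bisection."""
    N = np.asarray(noise_powers, dtype=float)
    lo, hi = N.min(), N.max() + total_power
    while hi - lo > tol:
        nu = 0.5 * (lo + hi)
        if np.maximum(nu - N, 0.0).sum() > total_power:
            hi = nu  # water level too high: allocated more than the budget
        else:
            lo = nu  # water level too low (or exact): raise it
    nu = 0.5 * (lo + hi)
    return np.maximum(nu - N, 0.0)

# Channels with lower noise receive more power; the allocation sums to the budget.
powers = water_filling([1.0, 2.0, 4.0], total_power=3.0)  # approximately [2., 1., 0.]
```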
Abstract: This paper applies information theory to the analysis of fundamental lower bounds on the maximum deviations in feedback control systems, where the plant is linear time-invariant while the controller can generically be any causal function as long as it stabilizes the plant. It is seen in general that the lower bounds are characterized by the unstable poles (or nonminimum-phase zeros) of the plant as well as the conditional entropy of the disturbance. Such bounds provide fundamental limits on how short the distribution tails in control systems can be made by feedback.
Abstract: In this short note, we investigate the feedback control of relativistic dynamics propelled by mass ejection, modeling, e.g., relativistic rocket control or relativistic (space-travel) flight control. As an extreme case, we also examine the control of relativistic photon rockets, which are propelled by ejecting photons.
Abstract: In this short note, we introduce the spectral-domain $\mathcal{W}_2$ Wasserstein distance for elliptical stochastic processes in terms of their power spectra. We also introduce the spectral-domain Gelbrich bound for processes that are not necessarily elliptical.
Abstract: This short note is on a property of the $\mathcal{W}_2$ Wasserstein distance indicating that independent elliptical distributions minimize their $\mathcal{W}_2$ Wasserstein distance from given independent elliptical distributions with the same density generators. Furthermore, we examine the implications of this property in the Gelbrich bound when the distributions are not necessarily elliptical, and we generalize the results to the case where the distributions are not independent. The primary purpose of this note is to serve as a reference for papers that need to make use of this property or its implications.
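For concreteness, the $\mathcal{W}_2$ distance underlying the property above admits a closed form in the Gaussian (hence elliptical) case; when the covariances are diagonal, the general Gelbrich expression reduces to a coordinatewise sum. The sketch below is illustrative background rather than the note's result, and the function name is hypothetical:

```python
import numpy as np

def w2_gaussian_diag(mu1, var1, mu2, var2):
    """Squared W2 distance between Gaussians with diagonal covariances:
    W2^2 = ||mu1 - mu2||^2 + sum_i (sqrt(var1_i) - sqrt(var2_i))^2,
    i.e., the Gelbrich formula specialized to commuting covariances."""
    mu1, var1, mu2, var2 = map(np.asarray, (mu1, var1, mu2, var2))
    return np.sum((mu1 - mu2) ** 2) + np.sum((np.sqrt(var1) - np.sqrt(var2)) ** 2)

# Two independent scalar Gaussians: N(0, 1) vs N(1, 4)
# W2^2 = (0 - 1)^2 + (1 - 2)^2 = 2
d2 = w2_gaussian_diag([0.0], [1.0], [1.0], [4.0])
```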
Abstract: This short note is on a property of the Kullback-Leibler (KL) divergence indicating that independent Gaussian distributions minimize the KL divergence from given independent Gaussian distributions. The primary purpose of this note is to serve as a reference for papers that need to make use of this property in whole or in part.
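As background for the property above, the KL divergence between independent (diagonal-covariance) Gaussians decomposes into a sum of coordinatewise closed-form terms. A minimal sketch, with a hypothetical function name:

```python
import numpy as np

def kl_gaussian_diag(mu0, var0, mu1, var1):
    """KL divergence D( N(mu0, diag(var0)) || N(mu1, diag(var1)) ).
    Independence means the divergence is the sum of the standard
    per-coordinate closed forms:
        0.5 * (var0/var1 + (mu1 - mu0)^2/var1 - 1 + log(var1/var0))."""
    mu0, var0, mu1, var1 = map(np.asarray, (mu0, var0, mu1, var1))
    return 0.5 * np.sum(
        var0 / var1 + (mu1 - mu0) ** 2 / var1 - 1.0 + np.log(var1 / var0)
    )

# D(N(0,1) || N(1,1)) = 0.5 per the closed form; zero for identical distributions.
d = kl_gaussian_diag([0.0], [1.0], [1.0], [1.0])
```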
Abstract: In this paper, we study the fundamental limits of obfuscation in terms of privacy-distortion tradeoffs for linear Gaussian dynamical systems via an information-theoretic approach. In particular, we obtain analytical formulas that capture the fundamental privacy-distortion tradeoffs when privacy masks are to be added to the outputs of the dynamical systems, while indicating explicitly how to design the privacy masks in an optimal way: the privacy masks should be colored Gaussian, with power spectra shaped specifically according to the system and noise properties.
Abstract: In this paper, we first introduce the notion of channel leakage as the minimum mutual information between the channel input and channel output. As its name indicates, channel leakage quantifies the (minimum) information leakage to a malicious receiver. In a broad sense, it can be viewed as a dual concept of channel capacity, which characterizes the (maximum) information transmission to the targeted receiver. We obtain explicit formulas for channel leakage in the white Gaussian case and the colored Gaussian case. We also study the implications of channel leakage in characterizing the fundamental limitations of privacy leakage for streaming data.
Abstract: In this paper, we examine the fundamental performance limitations of online machine learning, by viewing the online learning problem as a prediction problem with causal side information. Towards this end, we combine the entropic analysis from information theory and the innovations approach from prediction theory to derive generic lower bounds on the prediction errors as well as the conditions (in terms of, e.g., directed information) to achieve the bounds. It is seen in general that no specific restrictions have to be imposed on the learning algorithms or the distributions of the data points for the performance bounds to be valid. In addition, the cases of supervised learning, semi-supervised learning, as well as unsupervised learning can all be analyzed accordingly. We also investigate the implications of the results in analyzing the fundamental limits of generalization.
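Entropic lower bounds of the kind described above can be sanity-checked in the simplest jointly Gaussian setting, where the minimum mean-square error equals the entropy power of the conditional distribution and such a bound is tight. A toy illustration under jointly Gaussian assumptions (the function name is hypothetical, and this is assumed background rather than the paper's derivation):

```python
import numpy as np

def gaussian_entropy_power_check(var_x, var_n):
    """For jointly Gaussian x ~ N(0, var_x) and side information y = x + n
    with n ~ N(0, var_n) independent of x:
      - the MMSE of estimating x from y is var_x * var_n / (var_x + var_n);
      - the entropy power (1/(2*pi*e)) * exp(2 h(x|y)) equals the MMSE,
    so the entropy-power lower bound on prediction error is tight here."""
    mmse = var_x * var_n / (var_x + var_n)          # Gaussian MMSE
    h_cond = 0.5 * np.log(2 * np.pi * np.e * mmse)  # h(x|y) for jointly Gaussian x, y
    entropy_power = np.exp(2 * h_cond) / (2 * np.pi * np.e)
    return mmse, entropy_power
```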
Abstract: In this paper, we establish a connection between the feedback capacity of additive colored Gaussian noise channels and Kalman filters with additive colored Gaussian noises. In light of this, we are able to provide lower bounds on the feedback capacity of such channels with finite-order auto-regressive moving average colored noises, and the bounds are seen to be consistent with various existing results in the literature; in particular, the bound is tight in the case of first-order auto-regressive moving average colored noises. On the other hand, the Kalman filtering systems, after certain equivalence transformations, can be employed as recursive coding schemes/algorithms to achieve the lower bounds. In general, our results provide an alternative perspective while pointing to potentially tighter bounds for the feedback capacity problem.
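At the core of the Kalman-filter connection above are Riccati-type error-variance recursions. As assumed background (a scalar toy model, not the paper's coding scheme), one can iterate the scalar Riccati recursion to its steady state:

```python
import numpy as np

def kalman_prediction_variance(a, Q, R, n_iter=200):
    """Steady-state prediction error variance for the scalar system
        x_{k+1} = a * x_k + w_k,   Var(w) = Q,
        y_k     = x_k + v_k,       Var(v) = R,
    obtained by iterating the scalar Riccati recursion
        P_{k+1} = a^2 * P_k * R / (P_k + R) + Q
    until it (approximately) reaches its fixed point."""
    P = Q
    for _ in range(n_iter):
        P = a * a * P * R / (P + R) + Q
    return P

# Steady-state variance for a stable system; satisfies the fixed-point equation.
P_ss = kalman_prediction_variance(0.9, 1.0, 1.0)
```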