TY - GEN
T1 - Compression of noisy signals with information bottlenecks
AU - Emad, Amin
AU - Milenkovic, Olgica
PY - 2013
Y1 - 2013
AB - We consider a novel approach to the information bottleneck problem in which the goal is to compress a noisy signal while retaining a significant amount of information about a correlated auxiliary signal. To facilitate analysis, we cast compression with side information as an optimization problem involving an information measure which, for jointly Gaussian random variables, equals the classical mutual information. We provide closed-form expressions for locally optimal linear compression schemes; in particular, we show that the optimal solutions take the form of the product of an arbitrary full-rank matrix and the left eigenvectors corresponding to the smallest eigenvalues of a matrix related to the signals' covariance matrices. In addition, we study the influence of the sparsity level of the Bernoulli-Gaussian noise on the compression rate. We also highlight the similarities and differences between the noisy bottleneck problem and canonical correlation analysis (CCA), as well as the Gaussian information bottleneck problem.
UR - http://www.scopus.com/inward/record.url?scp=84893341177&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84893341177&partnerID=8YFLogxK
DO - 10.1109/ITW.2013.6691344
M3 - Conference contribution
AN - SCOPUS:84893341177
SN - 9781479913237
T3 - 2013 IEEE Information Theory Workshop, ITW 2013
BT - 2013 IEEE Information Theory Workshop, ITW 2013
T2 - 2013 IEEE Information Theory Workshop, ITW 2013
Y2 - 9 September 2013 through 13 September 2013
ER -