TY - GEN
T1 - From almost Gaussian to Gaussian
AU - Costa, Max H.M.
AU - Rioul, Olivier
N1 - Publisher Copyright:
© 2015 AIP Publishing LLC.
PY - 2015/1/1
Y1 - 2015/1/1
N2 - We consider lower and upper bounds on the difference between the differential entropies of a Gaussian random vector and an approximately Gaussian random vector after both are "smoothed" by an arbitrarily distributed random vector of finite power. These bounds are important for establishing the optimality of the corner points in the capacity region of Gaussian interference channels. A problematic issue in a previous attempt to establish these bounds was detected in 2004, and the corner points in question have since been dubbed "the missing corner points". The importance of the given bounds lies in the fact that they induce Fano-type inequalities for the Gaussian interference channel. Whereas the usual Fano inequalities are based on a communication requirement, the new inequalities are derived from a non-disturbance constraint. The upper bound on the difference of differential entropies is established by the data processing inequality (DPI). For the lower bound, we do not have a complete proof, but we present an argument based on continuity and the DPI.
AB - We consider lower and upper bounds on the difference between the differential entropies of a Gaussian random vector and an approximately Gaussian random vector after both are "smoothed" by an arbitrarily distributed random vector of finite power. These bounds are important for establishing the optimality of the corner points in the capacity region of Gaussian interference channels. A problematic issue in a previous attempt to establish these bounds was detected in 2004, and the corner points in question have since been dubbed "the missing corner points". The importance of the given bounds lies in the fact that they induce Fano-type inequalities for the Gaussian interference channel. Whereas the usual Fano inequalities are based on a communication requirement, the new inequalities are derived from a non-disturbance constraint. The upper bound on the difference of differential entropies is established by the data processing inequality (DPI). For the lower bound, we do not have a complete proof, but we present an argument based on continuity and the DPI.
KW - Gaussian approximation
KW - Gaussian interference channels
KW - Kullback-Leibler distance
KW - data processing inequality
KW - the missing corner points
U2 - 10.1063/1.4905964
DO - 10.1063/1.4905964
M3 - Conference contribution
AN - SCOPUS:85063830999
T3 - AIP Conference Proceedings
SP - 67
EP - 73
BT - Bayesian Inference and Maximum Entropy Methods in Science and Engineering, MaxEnt 2014
A2 - Mohammad-Djafari, Ali
A2 - Barbaresco, Frederic
PB - American Institute of Physics Inc.
T2 - 34th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, MaxEnt 2014
Y2 - 21 September 2014 through 26 September 2014
ER -