TY - GEN
T1 - Information theoretic perspectives on synchronization
AU - Tchamkerten, Aslan
AU - Khisti, Ashish
AU - Wornell, Gregory
PY - 2006/12/1
Y1 - 2006/12/1
N2 - We study the information theoretic limits of communication over asynchronous discrete memoryless channels. The transmitter starts sending a block codeword of length N at a time v uniformly distributed over {1, 2, . . ., L}. We assume that the receiver knows L but not v. We give a scaling law of L with respect to N for which reliable communication can be achieved. Specifically, we propose a communication scheme with the property that, unless the asynchrony level L grows at least as e^{NC}, where C denotes the capacity of the synchronized channel, arbitrarily low error probability can be achieved. If L grows sub-exponentially in N, the capacity is the same as that of the ordinary synchronized channel. Further, we provide a lower bound on the error probability given a certain channel, codebook, and asynchrony level. This bound, together with our scheme, shows that, in certain cases, the condition L ≤ e^{NC(1-δ)} for any δ > 0 is an asymptotic necessary and sufficient condition for reliable communication. Finally, we extend our analysis to a simple scenario where communication is carried over a Gaussian channel with antipodal signaling +√P and -√P. We show that a necessary condition on the amount of power needed in order to guarantee reliable communication is that P must scale as (1/N) log L when L → ∞.
UR - https://www.scopus.com/pages/publications/39049150867
U2 - 10.1109/ISIT.2006.261616
DO - 10.1109/ISIT.2006.261616
M3 - Conference contribution
AN - SCOPUS:39049150867
SN - 1424405041
SN - 9781424405046
T3 - IEEE International Symposium on Information Theory - Proceedings
SP - 371
EP - 375
BT - Proceedings - 2006 IEEE International Symposium on Information Theory, ISIT 2006
T2 - 2006 IEEE International Symposium on Information Theory, ISIT 2006
Y2 - 9 July 2006 through 14 July 2006
ER -