@article{tan2015asymptotic,
abstract = {This monograph presents a unified treatment of single- and multi-user
problems in Shannon's information theory where we depart from the requirement
that the error probability decays asymptotically in the blocklength. Instead,
the error probabilities for various problems are bounded above by a
non-vanishing constant and the spotlight is shone on achievable coding rates as
functions of the growing blocklengths. This represents the study of asymptotic
estimates with non-vanishing error probabilities.
In Part I, after reviewing the fundamentals of information theory, we discuss
Strassen's seminal result for binary hypothesis testing where the type-I error
probability is non-vanishing and the rate of decay of the type-II error
probability with growing number of independent observations is characterized.
In Part II, we use this basic hypothesis testing result to develop second- and,
sometimes, even third-order asymptotic expansions for point-to-point
communication. Finally, in Part III, we consider network information theory
problems for which the second-order asymptotics are known. These problems
include some classes of channels with random state, the multiple-encoder
distributed lossless source coding (Slepian-Wolf) problem and special cases of
the Gaussian interference and multiple-access channels. Finally, we discuss
avenues for further research.},
added-at = {2020-05-23T11:57:11.000+0200},
author = {Tan, Vincent Y. F.},
biburl = {https://www.bibsonomy.org/bibtex/2f6dcccc2fdecd152506dd8a9644b9e47/kirk86},
description = {[1504.02608] Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities},
interhash = {cfdf895bddcb20c6bb4411415e4cb938},
intrahash = {f6dcccc2fdecd152506dd8a9644b9e47},
keywords = {asymptotics book estimation information readings theory},
note = {arXiv:1504.02608. Comment: Further comments welcome},
timestamp = {2020-05-23T11:57:11.000+0200},
title = {Asymptotic Estimates in Information Theory with Non-Vanishing Error
Probabilities},
url = {http://arxiv.org/abs/1504.02608},
year = 2015
}