Sep 22, 2013
by Ofer Shayevitz
It is known that modeling an information source via a symbolic dynamical system evolving over the unit interval, leads to a natural lossless compression scheme attaining the entropy rate of the source, under general conditions. We extend this notion to the lossy compression regime assuming a feedforward link is available, by modeling a source via a two-dimensional symbolic dynamical system where one component corresponds to the compressed signal, and the other essentially corresponds to the...
Source: http://arxiv.org/abs/1001.3486v1

Sep 23, 2013
by Ofer Shayevitz
The R\'enyi information measures are characterized in terms of their Shannon counterparts, and properties of the former are recovered from first principle via the associated properties of the latter. Motivated by this characterization, a two-sensor composite hypothesis testing problem is presented, and the optimal worst case miss-detection exponent is obtained in terms of a R\'enyi divergence.
Source: http://arxiv.org/abs/1012.4401v1
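
A note for context: the R\'enyi divergence of order $\alpha$ that governs the exponent here is the standard quantity

$$D_\alpha(P\|Q) = \frac{1}{\alpha-1}\log\sum_x P(x)^{\alpha} Q(x)^{1-\alpha}, \qquad \alpha\in(0,1)\cup(1,\infty),$$

with the Kullback-Leibler divergence recovered in the limit $\alpha\to 1$.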

Jun 30, 2018
by Or Ordentlich; Ofer Shayevitz
The binary adder is a two-user multiple access channel whose inputs are binary and whose output is the real sum of the inputs. While the Shannon capacity region of this channel is well known, little is known regarding its zero-error capacity region, and a large gap remains between the best inner and outer bounds. In this paper, we provide an improved outer bound for this problem. To that end, we introduce a soft variation of the Sauer-Perles-Shelah Lemma, which is then used in conjunction with an...
Topics: Mathematics, Computing Research Repository, Information Theory
Source: http://arxiv.org/abs/1412.8670
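
To make the zero-error notion concrete: a codebook pair is uniquely decodable over the binary adder exactly when all componentwise integer sums of codeword pairs are distinct. Below is a minimal brute-force check (the function name and the small example pair are illustrative, not taken from the paper):

from itertools import product

def is_uniquely_decodable(A, B):
    # Zero-error for the binary adder MAC: the componentwise integer sums
    # a + b must be distinct over all codeword pairs (a, b) in A x B.
    sums = set()
    for a, b in product(A, B):
        s = tuple(x + y for x, y in zip(a, b))
        if s in sums:
            return False
        sums.add(s)
    return True

# A standard small example: A = {00, 11}, B = {00, 01, 10} gives six
# distinct sum patterns.
A = [(0, 0), (1, 1)]
B = [(0, 0), (0, 1), (1, 0)]
print(is_uniquely_decodable(A, B))  # True

The example pair yields six distinct sum patterns, i.e., a zero-error sum rate of $\frac{1}{2}\log_2 6\approx 1.29$ bits per channel use.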

Jun 29, 2018
by Lele Wang; Ofer Shayevitz
We introduce the notion of information ratio $\text{Ir}(H/G)$ between two (simple, undirected) graphs $G$ and $H$, defined as the supremum of ratios $k/n$ such that there exists a mapping from the strong product $G^k$ to $H^n$ that preserves non-adjacency. Operationally speaking, the information ratio is the maximal number of source symbols per channel use that can be reliably sent over a channel with a confusion graph $H$, where reliability is measured w.r.t. a source confusion graph $G$....
Topics: Information Theory, Discrete Mathematics, Combinatorics, Computing Research Repository, Mathematics
Source: http://arxiv.org/abs/1612.09343
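
Reading "preserves non-adjacency" literally as "distinct non-adjacent vertices of $G^k$ must map to distinct non-adjacent vertices of $H^n$", one can brute-force check a candidate mapping for small graphs. The sketch below reflects that reading; function names and the 5-cycle sanity check are mine, not the paper's:

from itertools import product

def strong_adjacent(u, v, adj):
    # Strong-product adjacency: distinct tuples are adjacent iff every
    # coordinate pair is equal or adjacent in the base graph.
    if u == v:
        return False
    return all(a == b or b in adj[a] for a, b in zip(u, v))

def preserves_nonadjacency(f, adjG, adjH, k):
    # Check that distinct non-adjacent vertices of G^k are mapped by f to
    # distinct non-adjacent vertices of H^n.
    verts = list(product(adjG.keys(), repeat=k))
    for u, v in product(verts, repeat=2):
        if u != v and not strong_adjacent(u, v, adjG):
            fu, fv = f(u), f(v)
            if fu == fv or strong_adjacent(fu, fv, adjH):
                return False
    return True

# Sanity check: the identity map on a 5-cycle (k = n = 1) trivially qualifies.
C5 = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
print(preserves_nonadjacency(lambda u: u, C5, C5, 1))  # True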

Jun 30, 2018
by Or Ordentlich; Ofer Shayevitz
A lossy source code $\mathcal{C}$ with rate $R$ for a discrete memoryless source $S$ is called subset-universal if for every $0<R'<R$...
Topics: Mathematics, Computing Research Repository, Information Theory
Source: http://arxiv.org/abs/1411.0443

Jun 30, 2018
by Amir Burin; Ofer Shayevitz
Alice holds an r.v. $X$, and Bob is trying to guess its value by asking questions of the form "is $X=x$?". Alice answers truthfully and the game terminates once Bob guesses correctly. Before the game begins, Bob is allowed to reach out to an oracle, Carole, and ask her any yes/no question, i.e., a question of the form "is $X\in A$?". Carole is known to lie with a given probability $0\leq p\leq 1/2$. What should Bob ask Carole if he would like to minimize his expected...
Topics: Information Theory, Computing Research Repository, Mathematics
Source: http://arxiv.org/abs/1703.01672
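
For context, the no-oracle baseline is classical: guessing candidate values in decreasing order of probability minimizes the expected number of "is $X=x$?" questions. A minimal sketch (the pmf is a made-up example; the oracle's effect is only described in a comment):

def expected_guesses(pmf):
    # With no side information, guessing values in decreasing order of
    # probability minimizes the expected number of "is X = x?" questions.
    probs = sorted(pmf, reverse=True)
    return sum(i * p for i, p in enumerate(probs, start=1))

# Toy pmf; a truthful oracle answer "is X in A?" would let Bob re-sort
# the candidates within the answered subset before guessing.
print(expected_guesses([0.5, 0.25, 0.125, 0.125]))  # 1.875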

Jun 30, 2018
by Or Ordentlich; Ofer Shayevitz
Let $\mathcal{F}_1$ and $\mathcal{F}_2$ be two families of subsets of an $n$-element set. We say that $\mathcal{F}_1$ and $\mathcal{F}_2$ are multiset-union-free if for any $A,B\in \mathcal{F}_1$ and $C,D\in \mathcal{F}_2$ the multisets $A\uplus C$ and $B\uplus D$ are different, unless both $A = B$ and $C= D$. We derive a new upper bound on the maximal sizes of multiset-union-free pairs, improving a result of Urbanke and Li.
Topics: Mathematics, Discrete Mathematics, Combinatorics, Computing Research Repository, Information Theory
Source: http://arxiv.org/abs/1412.8415
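
The definition can be checked directly by brute force for small families: the pair $(A,C)$ must be recoverable from the multiset union $A\uplus C$. A toy sketch (function name and example families are mine):

from collections import Counter
from itertools import product

def multiset_union_free(F1, F2):
    # All multiset unions A (+) C with A in F1, C in F2 must be distinct;
    # equivalently, the pair (A, C) is recoverable from the union.
    seen = set()
    for A, C in product(F1, F2):
        key = frozenset(Counter(list(A) + list(C)).items())
        if key in seen:
            return False
        seen.add(key)
    return True

# Toy example over the ground set {1, 2}.
F1 = [frozenset(), frozenset({1, 2})]
F2 = [frozenset(), frozenset({1}), frozenset({2})]
print(multiset_union_free(F1, F2))  # True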

Jun 29, 2018
by Nir Weinberger; Ofer Shayevitz
Suppose $Y^{n}$ is obtained by observing a uniform Bernoulli random vector $X^{n}$ through a binary symmetric channel. Courtade and Kumar asked how large the mutual information between $Y^{n}$ and a Boolean function $\mathsf{b}(X^{n})$ could be, and conjectured that the maximum is attained by a dictator function. An equivalent formulation of this conjecture is that dictator minimizes the prediction cost in a sequential prediction of $Y^{n}$ under logarithmic loss, given $\mathsf{b}(X^{n})$. In...
Topics: Information Theory, Computing Research Repository, Mathematics
Source: http://arxiv.org/abs/1607.02381
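
The equivalence mentioned in this abstract is the standard identity between the cumulative log-loss of the optimal (posterior) sequential predictor and conditional entropy:

$$\sum_{i=1}^{n}\mathbb{E}\big[-\log P\big(Y_i\mid Y^{i-1},\mathsf{b}(X^n)\big)\big] = H\big(Y^n\mid\mathsf{b}(X^n)\big) = H(Y^n)-I\big(\mathsf{b}(X^n);Y^n\big).$$

Since $Y^n$ is uniform, $H(Y^n)=n$ is fixed, so minimizing the prediction cost is the same as maximizing $I(\mathsf{b}(X^n);Y^n)$.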

Jun 27, 2018
by Or Ordentlich; Ofer Shayevitz
Mrs. Gerber's Lemma lower bounds the entropy at the output of a binary symmetric channel in terms of the entropy of the input process. In this paper, we lower bound the output entropy via a different measure of input uncertainty, pertaining to the minimum mean squared error (MMSE) prediction cost of the input process. We show that in many cases our bound is tighter than the one obtained from Mrs. Gerber's Lemma. As an application, we evaluate the bound for binary hidden Markov processes, and...
Topics: Information Theory, Computing Research Repository, Mathematics
Source: http://arxiv.org/abs/1506.00253
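
For reference, the classical (vector) Mrs. Gerber's Lemma invoked here states that if $Y^n$ is the output of a BSC($\alpha$) with input $X^n$, then

$$\frac{1}{n}H(Y^n) \ge h\Big(h^{-1}\big(\tfrac{1}{n}H(X^n)\big)\star\alpha\Big), \qquad a\star\alpha \triangleq a(1-\alpha)+(1-a)\alpha,$$

where $h$ is the binary entropy function and $h^{-1}:[0,1]\to[0,1/2]$ its inverse.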

Jun 28, 2018
by Ofer Shayevitz; Meir Feder
Posterior matching (PM) is a sequential horizon-free feedback communication scheme introduced by the authors, who also provided a rather involved optimality proof showing it achieves capacity for a large class of memoryless channels. Naghshvar et al. considered a non-sequential variation of PM with a fixed number of messages and a random decision time, and gave a simpler proof establishing its optimality via a novel Jensen-Shannon divergence argument. Another simpler optimality proof was given...
Topics: Information Theory, Computing Research Repository, Mathematics
Source: http://arxiv.org/abs/1507.08929

Sep 21, 2013
by Ofer Shayevitz; Meir Feder
We address the problem of universal communications over an unknown channel with an instantaneous noiseless feedback, and show how rates corresponding to the empirical behavior of the channel can be attained, although no rate can be guaranteed in advance. First, we consider a discrete modulo-additive channel with alphabet $\mathcal{X}$, where the noise sequence $Z^n$ is arbitrary and unknown and may causally depend on the transmitted and received sequences and on the encoder's message, possibly...
Source: http://arxiv.org/abs/0808.4135v2
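
As I understand the notion of empirical behavior used here (this is a paraphrase, not a quote from the paper), the zeroth-order empirical rate for the modulo-additive setup is

$$R_{\mathrm{emp}}(Z^n) = \log|\mathcal{X}| - \hat{H}(Z^n),$$

where $\hat{H}(Z^n)$ is the empirical entropy of the realized noise sequence.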

Sep 19, 2013
by Ofer Shayevitz; Meir Feder
In this paper we introduce a fundamental principle for optimal communication over general memoryless channels in the presence of noiseless feedback, termed posterior matching. Using this principle, we devise a (simple, sequential) generic feedback transmission scheme suitable for a large class of memoryless channels and input distributions, achieving any rate below the corresponding mutual information. This provides a unified framework for optimal feedback communication in which the Horstein...
Source: http://arxiv.org/abs/0909.4828v2
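
For reference, the matching rule at the heart of the scheme, in the notation commonly used for it ($\Theta_0$ the uniform message point, $F_X$ the target input distribution, $F_{\Theta_0\mid Y^n}$ the posterior CDF of the message point), is

$$X_{n+1} = F_X^{-1}\big(F_{\Theta_0\mid Y^n}(\Theta_0)\big),$$

i.e., at each step the transmitter re-describes the message point relative to the receiver's current posterior, shaped to the desired input distribution.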

Sep 23, 2013
by Ofer Shayevitz; Michele Wigger
A coding scheme for the discrete memoryless broadcast channel with {noiseless, noisy, generalized} feedback is proposed, and the associated achievable region derived. The scheme is based on a block-Markov strategy combining the Marton scheme and a lossy version of the Gray-Wyner scheme with side-information. In each block the transmitter sends fresh data and update information that allows the receivers to improve the channel outputs observed in the previous block. For a generalization of...
Source: http://arxiv.org/abs/1012.6012v3

Jun 29, 2018
by Sihuang Hu; Ofer Shayevitz
Motivated by the problem of zero-error broadcasting, we introduce a new notion of graph capacity, termed $\rho$-capacity, that generalizes the Shannon capacity of a graph. We derive upper and lower bounds on the $\rho$-capacity of arbitrary graphs, and provide a Lov\'asz-type upper bound for regular graphs. We study the behavior of the $\rho$-capacity under two graph operations: the strong product and the disjoint union. Finally, we investigate the connection between the structure of a graph...
Topics: Information Theory, Combinatorics, Computing Research Repository, Mathematics
Source: http://arxiv.org/abs/1607.07263
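
For reference, the classical Shannon capacity that the $\rho$-capacity generalizes is

$$\Theta(G) = \lim_{n\to\infty}\sqrt[n]{\alpha(G^{\boxtimes n})} = \sup_{n\ge 1}\sqrt[n]{\alpha(G^{\boxtimes n})},$$

where $\alpha(\cdot)$ is the independence number and $\boxtimes$ the strong product; the Lov\'asz theta function upper bounds $\Theta(G)$, which is presumably the prototype for the Lov\'asz-type bound mentioned above.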

Jun 26, 2018
by Assaf Ben-Yishai; Ofer Shayevitz
Consider a pair of terminals connected by two independent (feedforward and feedback) Additive White Gaussian Noise (AWGN) channels, and limited by individual power constraints. The first terminal would like to reliably send information to the second terminal at a given rate. While the reliability in the cases of no feedback and of noiseless feedback is well studied, not much is known about the case of noisy feedback. In this work, we present an interactive scheme that significantly improves the...
Topics: Computing Research Repository, Mathematics, Information Theory
Source: http://arxiv.org/abs/1501.06671

Jun 30, 2018
by Assaf Ben-Yishai; Ofer Shayevitz
Consider a pair of terminals connected by two independent additive white Gaussian noise channels, and limited by individual power constraints. The first terminal would like to reliably send information to the second terminal, within a given error probability. We construct an explicit interactive scheme consisting of only (non-linear) scalar operations, by endowing the Schalkwijk-Kailath noiseless feedback scheme with modulo arithmetic. Our scheme achieves a communication rate close to the...
Topics: Mathematics, Computing Research Repository, Information Theory
Source: http://arxiv.org/abs/1407.8022

Jun 28, 2018
by Assaf Ben-Yishai; Ofer Shayevitz
We study the problem of communication over an additive white Gaussian noise (AWGN) channel with an AWGN feedback channel. When the feedback channel is noiseless, the classic Schalkwijk-Kailath (S-K) scheme is known to achieve capacity in a simple sequential fashion, while attaining reliability superior to non-feedback schemes. In this work, we show how simplicity and reliability can be attained even when the feedback is noisy, provided that the feedback channel is sufficiently better than the...
Topics: Information Theory, Computing Research Repository, Mathematics
Source: http://arxiv.org/abs/1509.03085
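
To make the "simple sequential fashion" concrete, here is a hedged Monte-Carlo sketch of the error decay of the classical noiseless-feedback S-K scheme (this illustrates only the baseline scheme, not the paper's noisy-feedback construction; function and parameter names are mine):

import math
import random

def sk_error_variance(P=1.0, N0=1.0, rounds=10, trials=20000):
    # Noiseless-feedback Schalkwijk-Kailath error decay: each round the
    # transmitter sends the receiver's current estimation error scaled to
    # power P, and the receiver performs an MMSE update, shrinking the
    # error variance by a factor of (1 + P/N0) per round.
    snr = P / N0
    sq_errs = 0.0
    for _ in range(trials):
        theta = random.gauss(0, 1)                           # message point
        y = math.sqrt(P) * theta + random.gauss(0, math.sqrt(N0))
        theta_hat = (y / math.sqrt(P)) * (snr / (1 + snr))   # MMSE estimate
        var = 1.0 / (1 + snr)                                # its error variance
        for _ in range(rounds):
            eps = theta_hat - theta                          # known to Tx via feedback
            c = math.sqrt(P / var)                           # scale error to power P
            y = c * eps + random.gauss(0, math.sqrt(N0))
            theta_hat -= (c * var / (c * c * var + N0)) * y  # MMSE update
            var = var * N0 / (c * c * var + N0)              # = var / (1 + snr)
        sq_errs += (theta_hat - theta) ** 2
    return sq_errs / trials, var

empirical, analytic = sk_error_variance()
print(empirical, analytic)   # both close to (1 + P/N0) ** -(rounds + 1)

Each round scales the receiver's current estimation error up to the power constraint and applies an MMSE update, so the error variance shrinks geometrically by a factor of $1+P/N_0$ per channel use.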

Jun 27, 2018
by Assaf Ben-Yishai; Ofer Shayevitz
We consider the problem of communication over a two-user Additive White Gaussian Noise Broadcast Channel (AWGN-BC) with an AWGN Multiple Access (MAC) active feedback. We describe a constructive reduction from this setup to the well-studied setup of linear-feedback coding over the AWGN-BC with noiseless feedback (and different parameters). This reduction facilitates the design of linear-feedback coding schemes in the (passive) noiseless feedback regime, which can then be easily and...
Topics: Information Theory, Computing Research Repository, Mathematics
Source: http://arxiv.org/abs/1503.05297

Jun 29, 2018
by Yonatan Kaspi; Ofer Shayevitz; Tara Javidi
Consider a target moving at a constant velocity on a unit-circumference circle, starting at an arbitrary location. To acquire the target, any region of the circle can be probed to obtain a noisy measurement of the target's presence, where the noise level increases with the size of the probed region. We are interested in the expected time required to find the target to within some given resolution and error probability. For a known velocity, we characterize the optimal tradeoff between time and...
Topics: Information Theory, Computing Research Repository, Mathematics
Source: http://arxiv.org/abs/1611.08959

Jun 30, 2018
by Yonatan Kaspi; Ofer Shayevitz; Tara Javidi
Consider a target moving with a constant velocity on a unit-circumference circle, starting from an arbitrary location. To acquire the target, any region of the circle can be probed for its presence, but the associated measurement noise increases with the size of the probed region. We are interested in the expected time required to find the target to within some given resolution and error probability. For a known velocity, we characterize the optimal tradeoff between time and resolution (i.e.,...
Topics: Mathematics, Computing Research Repository, Information Theory
Source: http://arxiv.org/abs/1408.4073

Jul 20, 2013
by Ofer Shayevitz; Ram Zamir; Meir Feder
We address the problem of delay in an arithmetic coding system. Due to the nature of the arithmetic coding process, source sequences causing arbitrarily large encoding or decoding delays exist. This phenomenon raises the question of just how large the expected input-to-output delay in these systems is, i.e., once a source sequence has been encoded, what is the expected number of source letters that should be further encoded to allow full decoding of that sequence. In this paper, we derive...
Source: http://arxiv.org/abs/cs/0604106v1
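
To illustrate the phenomenon, here is a toy binary arithmetic encoder (no underflow handling; function names and parameters are mine) together with an adversarial source sequence that keeps the coding interval straddling $1/2$, so no output bit can ever be released:

def encode_with_delay(symbols, p0):
    # Toy binary arithmetic encoder: an output bit can be released only once
    # the current interval lies entirely inside [0, 1/2) or [1/2, 1);
    # intervals that keep straddling 1/2 stall the output.
    lo, hi, emitted = 0.0, 1.0, []
    for s in symbols:
        mid = lo + p0 * (hi - lo)
        lo, hi = (lo, mid) if s == 0 else (mid, hi)
        while hi <= 0.5 or lo >= 0.5:        # release bits and rescale
            if hi <= 0.5:
                emitted.append(0)
                lo, hi = 2 * lo, 2 * hi
            else:
                emitted.append(1)
                lo, hi = 2 * lo - 1, 2 * hi - 1
    return emitted

def straddling_sequence(n, p0):
    # Adversarial source: always pick the symbol whose subinterval contains
    # 1/2, so the interval straddles 1/2 forever and no bit is released.
    lo, hi, seq = 0.0, 1.0, []
    for _ in range(n):
        mid = lo + p0 * (hi - lo)
        if mid > 0.5:
            seq.append(0); hi = mid
        else:
            seq.append(1); lo = mid
    return seq

print(len(encode_with_delay(straddling_sequence(20, 2 / 3), 2 / 3)))  # 0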

Jun 30, 2018
by Sihuang Hu; Nir Weinberger; Ofer Shayevitz
We investigate the asymptotic rates of length-$n$ binary codes with VC-dimension at most $dn$ and minimum distance at least $\delta n$. Two upper bounds are obtained, one as a simple corollary of a result by Haussler and the other via a shortening approach combining Sauer-Shelah lemma and the linear programming bound. Two lower bounds are given using Gilbert-Varshamov type arguments over constant-weight and Markov-type sets.
Topics: Information Theory, Computing Research Repository, Mathematics
Source: http://arxiv.org/abs/1703.01586
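
For reference, the classical Sauer-Shelah lemma used in the shortening approach states that any $\mathcal{C}\subseteq\{0,1\}^n$ with VC-dimension at most $dn$ satisfies

$$|\mathcal{C}| \le \sum_{i=0}^{\lfloor dn\rfloor}\binom{n}{i},$$

so the rate of such a code is asymptotically at most $h(d)$ for $d\le 1/2$; the bounds described above additionally exploit the minimum-distance constraint.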

Jun 28, 2018
by Yanjun Han; Or Ordentlich; Ofer Shayevitz
The mutual information between two jointly distributed random variables $X$ and $Y$ is a functional of the joint distribution $P_{XY},$ which is sometimes difficult to handle or estimate. A coarser description of the statistical behavior of $(X,Y)$ is given by the marginal distributions $P_X, P_Y$ and the adjacency relation induced by the joint distribution, where $x$ and $y$ are adjacent if $P(x,y)>0$. We derive a lower bound on the mutual information in terms of these entities. The bound...
Topics: Information Theory, Computing Research Repository, Mathematics
Source: http://arxiv.org/abs/1507.06296

Jun 27, 2018
by Or Ordentlich; Ofer Shayevitz; Omri Weinstein
Suppose $X$ is a uniformly distributed $n$-dimensional binary vector and $Y$ is obtained by passing $X$ through a binary symmetric channel with crossover probability $\alpha$. A recent conjecture by Courtade and Kumar postulates that $I(f(X);Y)\leq 1-h(\alpha)$ for any Boolean function $f$. So far, the best known upper bound was $I(f(X);Y)\leq (1-2\alpha)^2$. In this paper, we derive a new upper bound that holds for all balanced functions, and improves upon the best known bound for all...
Topics: Information Theory, Computing Research Repository, Mathematics
Source: http://arxiv.org/abs/1505.05794
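
A small numerical sanity check of the quantities involved: for the dictator function $f(x)=x_1$ the mutual information equals $1-h(\alpha)$, and any other Boolean function can be compared against it by enumerating the joint distribution of $(f(X),Y)$. The script below is an illustrative sketch (function names and the choice $n=3$, $\alpha=0.1$ are mine):

from itertools import product
from math import log2

def mi_boolean(f, n, alpha):
    # I(f(X); Y) for X uniform on {0,1}^n and Y = X through a BSC(alpha),
    # computed by enumerating the joint distribution of (f(X), Y).
    joint = {}
    for x in product((0, 1), repeat=n):
        for y in product((0, 1), repeat=n):
            flips = sum(a != b for a, b in zip(x, y))
            p = alpha ** flips * (1 - alpha) ** (n - flips) / 2 ** n
            joint[(f(x), y)] = joint.get((f(x), y), 0.0) + p
    pb, py = {}, {}
    for (b, y), p in joint.items():
        pb[b] = pb.get(b, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * log2(p / (pb[b] * py[y])) for (b, y), p in joint.items() if p > 0)

alpha = 0.1
h = lambda a: -a * log2(a) - (1 - a) * log2(1 - a)
print(mi_boolean(lambda x: x[0], 3, alpha), 1 - h(alpha))   # dictator: ~0.531 each
print(mi_boolean(lambda x: int(sum(x) >= 2), 3, alpha))     # majority of 3: smaller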

Sep 23, 2013
by Ofer Shayevitz; Eado Meron; Meir Feder; Ram Zamir
The penalty incurred by imposing a finite delay constraint in lossless source coding of a memoryless source is investigated. It is well known that for the so-called block-to-variable and variable-to-variable codes, the redundancy decays at best polynomially with the delay, which in this case is identified with the block or maximal phrase length, respectively. In stark contrast, it is shown that the redundancy can be made to decay exponentially with the delay constraint. The corresponding...
Source: http://arxiv.org/abs/1012.4225v1