Volume 4, Issue 5, Article 93, 2003.

Received 03 April, 2003; accepted 21 October, 2003.

Communicated by: S.S. Dragomir


Journal of Inequalities in Pure and Applied Mathematics

AN ENTROPY POWER INEQUALITY FOR THE BINOMIAL FAMILY

PETER HARREMOËS AND CHRISTOPHE VIGNAT

Department of Mathematics, University of Copenhagen, Universitetsparken 5, 2100 Copenhagen, Denmark. EMail: moes@math.ku.dk

University of Copenhagen and Université de Marne la Vallée, 77454 Marne la Vallée Cedex 2, France. EMail: vignat@univ-mlv.fr

© 2000 Victoria University. ISSN (electronic): 1443-5756. 043-03


J. Ineq. Pure and Appl. Math. 4(5) Art. 93, 2003

http://jipam.vu.edu.au

Abstract

In this paper, we prove that the classical Entropy Power Inequality, as derived in the continuous case, can be extended to the discrete family of binomial random variables with parameter 1/2.

2000 Mathematics Subject Classification: 94A17

Key words: Entropy Power Inequality, Discrete random variable

The first author is supported by a post-doc fellowship from the Villum Kann Rasmussen Foundation, by INTAS (project 00-738) and by the Danish Natural Science Council. This work was done during a visit of the second author at the Dept. of Math., University of Copenhagen, in March 2003.

Contents

1. Introduction
2. Superadditivity
3. An Information Theoretic Inequality
4. Proof of the Main Theorem
5. Acknowledgements

References


1. Introduction

The continuous Entropy Power Inequality

(1.1)    $e^{2h(X)} + e^{2h(Y)} \le e^{2h(X+Y)}$

was first stated by Shannon [1] and later proved by Stam [2] and Blachman [3].

Later, several related inequalities for continuous variables were proved in [4], [5] and [6]. There have been several attempts to provide discrete versions of the Entropy Power Inequality: in the case of Bernoulli sources with addition modulo 2, results have been obtained in a series of papers [7], [8], [9] and [11].

In general, inequality (1.1) does not hold when $X$ and $Y$ are discrete random variables and the differential entropy is replaced by the discrete entropy: a simple counterexample is provided when $X$ and $Y$ are deterministic.

In what follows, $X_n \sim B\!\left(n, \tfrac{1}{2}\right)$ denotes a binomial random variable with parameters $n$ and $\tfrac{1}{2}$, and we prove our main theorem:

Theorem 1.1. The sequence $X_n$ satisfies the following Entropy Power Inequality:

$\forall m, n \ge 1, \qquad e^{2H(X_n)} + e^{2H(X_m)} \le e^{2H(X_n + X_m)}.$

With this aim in mind, we use a characterization of the superadditivity of a function, together with an entropic inequality.
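Before proving Theorem 1.1, its statement is easy to probe numerically. The following Python sketch (the helper names `binom_entropy` and `epi_holds` are ours, not the paper's) relies on the fact that for independent $X_n$ and $X_m$ the sum $X_n + X_m$ is again binomial $B\!\left(n+m, \tfrac{1}{2}\right)$; entropies are taken in nats, matching the base-$e$ entropy power.

```python
import math
from math import comb

def binom_entropy(n: int) -> float:
    """Shannon entropy, in nats, of X_n ~ B(n, 1/2)."""
    probs = [comb(n, k) / 2**n for k in range(n + 1)]
    return -sum(p * math.log(p) for p in probs)

def epi_holds(n: int, m: int) -> bool:
    """Check e^{2H(X_n)} + e^{2H(X_m)} <= e^{2H(X_{n+m})}.

    X_n + X_m ~ B(n + m, 1/2) for independent summands, so the
    right-hand side only needs the entropy of one binomial law.
    """
    lhs = math.exp(2 * binom_entropy(n)) + math.exp(2 * binom_entropy(m))
    rhs = math.exp(2 * binom_entropy(n + m))
    return lhs <= rhs + 1e-9  # tolerance: equality holds at n = m = 1
```

For $n = m = 1$ the inequality is tight: $e^{2H(X_1)} + e^{2H(X_1)} = 4 + 4 = 8 = e^{2H(X_2)}$.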


2. Superadditivity

Definition 2.1. A function $n \mapsto Y_n$ is superadditive if

$\forall m, n, \qquad Y_{m+n} \ge Y_m + Y_n.$

A sufficient condition for superadditivity is given by the following result.

Proposition 2.1. If $n \mapsto \frac{Y_n}{n}$ is increasing, then $Y_n$ is superadditive.

Proof. Take $m$ and $n$ and suppose $m \ge n$. Then by assumption

$\frac{Y_{m+n}}{m+n} \ge \frac{Y_m}{m}, \qquad \text{or} \qquad Y_{m+n} \ge Y_m + \frac{n}{m} Y_m.$

However, by the hypothesis $m \ge n$,

$\frac{Y_m}{m} \ge \frac{Y_n}{n}, \qquad \text{i.e.} \qquad \frac{n}{m} Y_m \ge Y_n,$

so that

$Y_{m+n} \ge Y_m + Y_n.$

In order to prove that the function

(2.1)    $Y_n = e^{2H(X_n)}$

is superadditive, it then suffices to show that the function $n \mapsto \frac{Y_n}{n}$ is increasing.
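Proposition 2.1 can be sanity-checked numerically on the sequence $Y_n = e^{2H(X_n)}$ of (2.1). This Python sketch (the helper name `Y` is ours) verifies both the hypothesis and the conclusion over a small range; it is illustrative only, the monotonicity of $Y_n/n$ being exactly what the next sections prove.

```python
import math
from math import comb

def Y(n: int) -> float:
    """Y_n = exp(2 H(X_n)) for X_n ~ B(n, 1/2), entropy in nats."""
    probs = [comb(n, k) / 2**n for k in range(n + 1)]
    H = -sum(p * math.log(p) for p in probs)
    return math.exp(2 * H)

# Hypothesis of Proposition 2.1: n -> Y_n / n is increasing ...
ratios = [Y(n) / n for n in range(1, 21)]
increasing = all(a <= b + 1e-12 for a, b in zip(ratios, ratios[1:]))

# ... and its conclusion: Y_{m+n} >= Y_m + Y_n (superadditivity).
superadditive = all(Y(m + n) >= Y(m) + Y(n) - 1e-9
                    for m in range(1, 11) for n in range(1, 11))
```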


3. An Information Theoretic Inequality

Denote by $B \sim \mathrm{Ber}\!\left(\tfrac{1}{2}\right)$ a Bernoulli random variable, independent of $X_n$, so that

(3.1)    $X_{n+1} = X_n + B$

and

(3.2)    $P_{X_{n+1}} = P_{X_n} * P_B = \frac{1}{2}\left(P_{X_n} + P_{X_n + 1}\right),$

where $P_{X_n} = \{p_k^n\}$ denotes the probability law of $X_n$, with

(3.3)    $p_k^n = 2^{-n} \binom{n}{k}.$

A direct application of an equality by Topsøe [12] yields

(3.4)    $H\left(P_{X_{n+1}}\right) = \frac{1}{2} H\left(P_{X_n + 1}\right) + \frac{1}{2} H\left(P_{X_n}\right) + \frac{1}{2} D\left(P_{X_n + 1} \,\|\, P_{X_{n+1}}\right) + \frac{1}{2} D\left(P_{X_n} \,\|\, P_{X_{n+1}}\right).$

Introduce the Jensen-Shannon divergence

(3.5)    $JSD(P, Q) = \frac{1}{2} D\left(P \,\Big\|\, \frac{P+Q}{2}\right) + \frac{1}{2} D\left(Q \,\Big\|\, \frac{P+Q}{2}\right)$

and remark that

(3.6)    $H\left(P_{X_n}\right) = H\left(P_{X_n + 1}\right),$


since each distribution is a shifted version of the other. We thus conclude that

(3.7)    $H\left(P_{X_{n+1}}\right) = H\left(P_{X_n}\right) + JSD\left(P_{X_n + 1}, P_{X_n}\right),$

showing that the entropy of a binomial law is an increasing function of $n$. We now need the stronger result that $\frac{Y_n}{n}$ is an increasing sequence, or equivalently that

(3.8)    $\log \frac{Y_{n+1}}{n+1} \ge \log \frac{Y_n}{n},$

or

(3.9)    $JSD\left(P_{X_n + 1}, P_{X_n}\right) \ge \frac{1}{2} \log \frac{n+1}{n}.$
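Identity (3.7) can be checked directly, since by (3.2) the law $P_{X_{n+1}}$ is exactly the midpoint $\frac{1}{2}\left(P_{X_n} + P_{X_n + 1}\right)$. A Python sketch (all helper names are ours):

```python
import math
from math import comb

def pmf(n: int) -> dict:
    """Law of X_n ~ B(n, 1/2) as a dict {k: p_k}."""
    return {k: comb(n, k) / 2**n for k in range(n + 1)}

def entropy(p: dict) -> float:
    return -sum(v * math.log(v) for v in p.values() if v > 0)

def kl(p: dict, q: dict) -> float:
    """D(P || Q); assumes supp(P) is contained in supp(Q)."""
    return sum(v * math.log(v / q[k]) for k, v in p.items() if v > 0)

def jsd(p: dict, q: dict) -> float:
    """Jensen-Shannon divergence of eq. (3.5)."""
    keys = set(p) | set(q)
    mid = {k: (p.get(k, 0.0) + q.get(k, 0.0)) / 2 for k in keys}
    return 0.5 * kl(p, mid) + 0.5 * kl(q, mid)

def residual_3_7(n: int) -> float:
    """|H(P_{X_{n+1}}) - H(P_{X_n}) - JSD(P_{X_n + 1}, P_{X_n})|."""
    p = pmf(n)
    shifted = {k + 1: v for k, v in p.items()}  # law of X_n + 1
    return abs(entropy(pmf(n + 1)) - entropy(p) - jsd(shifted, p))
```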

We use the following expansion of the Jensen-Shannon divergence, due to B.Y. Ryabko and reported in [13].

Lemma 3.1. The Jensen-Shannon divergence can be expanded as

$JSD(P, Q) = \frac{1}{2} \sum_{\nu=1}^{\infty} \frac{1}{2\nu(2\nu - 1)} \, \Delta_\nu(P, Q),$

with

$\Delta_\nu(P, Q) = \sum_{i=1}^{n} \frac{|p_i - q_i|^{2\nu}}{(p_i + q_i)^{2\nu - 1}}.$

This lemma, applied in the particular case where $P = P_{X_n}$ and $Q = P_{X_n + 1}$, yields the following result.


Lemma 3.2. The Jensen-Shannon divergence between $P_{X_n + 1}$ and $P_{X_n}$ can be expressed as

$JSD\left(P_{X_n + 1}, P_{X_n}\right) = \sum_{\nu=1}^{\infty} \frac{1}{\nu(2\nu - 1)} \cdot \frac{2^{2\nu - 1}}{(n+1)^{2\nu}} \, m_{2\nu}\!\left(B\!\left(n+1, \tfrac{1}{2}\right)\right),$

where $m_{2\nu}\!\left(B\!\left(n+1, \tfrac{1}{2}\right)\right)$ denotes the order-$2\nu$ central moment of a binomial random variable $B\!\left(n+1, \tfrac{1}{2}\right)$.

Proof. Denote $p_i = P_{X_n}(i)$, $p_i^+ = P_{X_n + 1}(i)$ and $\bar{p}_i = \left(p_i + p_i^+\right)/2$. For the term $\Delta_\nu\left(P_{X_n + 1}, P_{X_n}\right)$ we have

$\Delta_\nu\left(P_{X_n + 1}, P_{X_n}\right) = \sum_{i} \frac{\left(p_i^+ - p_i\right)^{2\nu}}{\left(p_i^+ + p_i\right)^{2\nu - 1}} = 2 \sum_{i} \left(\frac{p_i^+ - p_i}{p_i^+ + p_i}\right)^{2\nu} \bar{p}_i$

and

$\frac{p_i^+ - p_i}{p_i^+ + p_i} = \frac{2^{-n}\binom{n}{i-1} - 2^{-n}\binom{n}{i}}{2^{-n}\binom{n}{i-1} + 2^{-n}\binom{n}{i}} = \frac{2i - n - 1}{n + 1},$

so that, since by (3.2) $\bar{p}_i = 2^{-(n+1)}\binom{n+1}{i}$ is precisely the law of $B\!\left(n+1, \tfrac{1}{2}\right)$,

$\Delta_\nu\left(P_{X_n + 1}, P_{X_n}\right) = 2 \sum_{i} \left(\frac{2i - n - 1}{n + 1}\right)^{2\nu} \bar{p}_i = 2 \left(\frac{2}{n+1}\right)^{2\nu} \sum_{i} \left(i - \frac{n+1}{2}\right)^{2\nu} \bar{p}_i = \frac{2^{2\nu + 1}}{(n+1)^{2\nu}} \, m_{2\nu}\!\left(B\!\left(n+1, \tfrac{1}{2}\right)\right).$
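The closed form just derived can be cross-checked against a direct evaluation of the defining sum of $\Delta_\nu$; in the Python sketch below (the names `delta` and `moment_form` are ours), the two expressions agree to floating-point accuracy.

```python
from math import comb

def delta(nu: int, n: int) -> float:
    """Delta_nu(P_{X_n + 1}, P_{X_n}), summed directly as in Lemma 3.1."""
    p = {k: comb(n, k) / 2**n for k in range(n + 1)}   # P_{X_n}
    pp = {k + 1: v for k, v in p.items()}              # P_{X_n + 1}
    keys = set(p) | set(pp)
    return sum(abs(pp.get(k, 0.0) - p.get(k, 0.0)) ** (2 * nu)
               / (pp.get(k, 0.0) + p.get(k, 0.0)) ** (2 * nu - 1)
               for k in keys)

def moment_form(nu: int, n: int) -> float:
    """Closed form 2^{2 nu + 1} / (n+1)^{2 nu} * m_{2 nu}(B(n+1, 1/2))."""
    m = n + 1
    central = sum((comb(m, k) / 2**m) * (k - m / 2) ** (2 * nu)
                  for k in range(m + 1))
    return 2 ** (2 * nu + 1) / m ** (2 * nu) * central
```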


Finally, the Jensen-Shannon divergence becomes

$JSD\left(P_{X_n + 1}, P_{X_n}\right) = \frac{1}{4} \sum_{\nu=1}^{+\infty} \frac{1}{\nu(2\nu - 1)} \, \Delta_\nu\left(P_{X_n + 1}, P_{X_n}\right) = \sum_{\nu=1}^{+\infty} \frac{1}{\nu(2\nu - 1)} \cdot \frac{2^{2\nu - 1}}{(n+1)^{2\nu}} \, m_{2\nu}\!\left(B\!\left(n+1, \tfrac{1}{2}\right)\right).$


4. Proof of the Main Theorem

We are now in a position to show that the function $n \mapsto \frac{Y_n}{n}$ is increasing, or equivalently that inequality (3.9) holds.

Proof. We remark that it suffices to prove the inequality

(4.1)    $\sum_{\nu=1}^{3} \frac{1}{\nu(2\nu - 1)} \cdot \frac{2^{2\nu - 1}}{(n+1)^{2\nu}} \, m_{2\nu}\!\left(B\!\left(n+1, \tfrac{1}{2}\right)\right) \ge \frac{1}{2} \log\left(1 + \frac{1}{n}\right),$

since the terms $\nu > 3$ in the expansion of the Jensen-Shannon divergence are all non-negative. Now an explicit computation of the first three even central moments of a binomial random variable with parameters $n+1$ and $\tfrac{1}{2}$ yields

$m_2 = \frac{n+1}{4}, \qquad m_4 = \frac{(n+1)(3n+1)}{16} \qquad \text{and} \qquad m_6 = \frac{(n+1)(15n^2+1)}{64},$

so that inequality (4.1) becomes

$\frac{1}{60} \cdot \frac{30n^4 + 135n^3 + 245n^2 + 145n + 37}{(n+1)^5} \ge \frac{1}{2} \log\left(1 + \frac{1}{n}\right).$

Let us now upper-bound the right-hand side as

$\log\left(1 + \frac{1}{n}\right) \le \frac{1}{n} - \frac{1}{2n^2} + \frac{1}{3n^3},$

so that it suffices to prove that

$\frac{1}{60} \cdot \frac{30n^4 + 135n^3 + 245n^2 + 145n + 37}{(n+1)^5} - \frac{1}{2}\left(\frac{1}{n} - \frac{1}{2n^2} + \frac{1}{3n^3}\right) \ge 0.$
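Both the algebraic rearrangement that follows and the positivity claim for $n \ge 7$ can be verified in exact rational arithmetic. The Python sketch below uses `fractions.Fraction`; the names `P` and `difference` are ours.

```python
from fractions import Fraction as F

def P(n: int) -> int:
    """P(n) = 10 n^5 - 55 n^4 - 63 n^3 - 55 n^2 - 35 n - 10."""
    return 10*n**5 - 55*n**4 - 63*n**3 - 55*n**2 - 35*n - 10

def difference(n: int) -> F:
    """Left-hand side of the last display minus the polynomial bound
    on (1/2) log(1 + 1/n), computed exactly."""
    x = F(n)
    lhs = F(1, 60) * (30*x**4 + 135*x**3 + 245*x**2 + 145*x + 37) / (x + 1)**5
    bound = F(1, 2) * (1 / x - 1 / (2 * x**2) + 1 / (3 * x**3))
    return lhs - bound

# The rearrangement: difference(n) == P(n) / (60 (n+1)^5 n^3), exactly.
rearrangement_ok = all(difference(n) == F(P(n), 60 * (n + 1)**5 * n**3)
                       for n in range(1, 50))

# P(n) > 0 for n >= 7; it is negative for 1 <= n <= 6, hence Table 1.
positivity_ok = all(P(n) > 0 for n in range(7, 500))
```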


Rearranging the terms yields the equivalent inequality

$\frac{1}{60} \cdot \frac{10n^5 - 55n^4 - 63n^3 - 55n^2 - 35n - 10}{(n+1)^5 \, n^3} \ge 0,$

which is equivalent to the positivity of the polynomial

$P(n) = 10n^5 - 55n^4 - 63n^3 - 55n^2 - 35n - 10.$

Assuming first that $n \ge 7$, we remark that, bounding $n^3 \le \frac{n^4}{6}$, $n^2 \le \frac{n^4}{6^2}$, $n \le \frac{n^4}{6^3}$ and $1 \le \frac{n^4}{6^4}$ (all valid for $n \ge 6$),

$P(n) \ge 10n^5 - n^4\left(55 + \frac{63}{6} + \frac{55}{6^2} + \frac{35}{6^3} + \frac{10}{6^4}\right) = \left(10n - \frac{5443}{81}\right) n^4,$

whose positivity is ensured as soon as $n \ge 7$.

This result can be extended to the values $1 \le n \le 6$ by direct inspection of the values of the function $n \mapsto \frac{Y_n}{n}$, as given in the following table.

    n                1      2      3      4      5      6
    e^{2H(X_n)}/n    4      4      4.105  4.173  4.212  4.233

Table 1: Values of the function $n \mapsto \frac{Y_n}{n}$ for $1 \le n \le 6$.
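The entries of Table 1 can be reproduced directly from the definition of $Y_n$; in this Python sketch (the name `ratio` is ours), the entropy is taken in nats.

```python
import math
from math import comb

def ratio(n: int) -> float:
    """e^{2 H(X_n)} / n for X_n ~ B(n, 1/2), entropy in nats."""
    probs = [comb(n, k) / 2**n for k in range(n + 1)]
    H = -sum(p * math.log(p) for p in probs)
    return math.exp(2 * H) / n

values = [ratio(n) for n in range(1, 7)]
```

The computed sequence matches Table 1 to the displayed precision and is non-decreasing, which settles the cases $1 \le n \le 6$.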


5. Acknowledgements

The authors want to thank Rudolf Ahlswede for useful discussions and for drawing their attention to earlier work on the continuous and the discrete Entropy Power Inequalities.


References

[1] C.E. SHANNON, A mathematical theory of communication, Bell Syst. Tech. J., 27 (1948), 379–423 and 623–656.

[2] A.J. STAM, Some inequalities satisfied by the quantities of information of Fisher and Shannon, Inform. Contr., 2 (1959), 101–112.

[3] N. M. BLACHMAN, The convolution inequality for entropy powers, IEEE Trans. Inform. Theory, IT-11 (1965), 267–271.

[4] M.H.M. COSTA, A new entropy power inequality, IEEE Trans. Inform. Theory, 31 (1985), 751–760.

[5] A. DEMBO, Simple proof of the concavity of the entropy power with respect to added Gaussian noise, IEEE Trans. Inform. Theory, 35 (1989), 887–888.

[6] O. JOHNSON, A conditional entropy power inequality for dependent variables, Statistical Laboratory Research Reports, 20 (2000), Cambridge University.

[7] A. WYNER AND J. ZIV, A theorem on the entropy of certain binary sequences and applications: Part I, IEEE Trans. Inform. Theory, IT-19 (1973), 769–772.

[8] A. WYNER, A theorem on the entropy of certain binary sequences and applications: Part II, IEEE Trans. Inform. Theory, IT-19 (1973), 772–777.

[9] H.S. WITSENHAUSEN, Entropy inequalities for discrete channels, IEEE Trans. Inform. Theory, IT-20 (1974), 610–616.


[10] R. AHLSWEDE AND J. KÖRNER, On the connection between the entropies of input and output distributions of discrete memoryless channels, Proceedings of the Fifth Conference on Probability Theory, Brasov, Sept. 1974, 13–22, Editura Academiei Republicii Socialiste Romania, Bucuresti, 1977.

[11] S. SHAMAI AND A. WYNER, A binary analog to the entropy-power inequality, IEEE Trans. Inform. Theory, IT-36 (1990), 1428–1430.

[12] F. TOPSØE, Information theoretical optimization techniques, Kybernetika, 15(1) (1979), 8–27.

[13] F. TOPSØE, Some inequalities for information divergence and related measures of discrimination, IEEE Trans. Inform. Theory, IT-46(4) (2000), 1602–1609.
