
A CHARACTERIZATION OF THE UNIFORM DISTRIBUTION ON THE CIRCLE BY STAM INEQUALITY

vol. 10, iss. 2, art. 34, 2009

PAOLO GIBILISCO
Dipartimento SEFEMEQ, Facoltà di Economia, Università di Roma “Tor Vergata”, Via Columbia 2, 00133 Rome, Italy.
EMail: gibilisco@volterra.uniroma2.it

DANIELE IMPARATO
Dipartimento di Matematica, Politecnico di Torino, Corso Duca degli Abruzzi 24, 10129 Turin, Italy.
EMail: daniele.imparato@polito.it

TOMMASO ISOLA
Dipartimento di Matematica, Università di Roma “Tor Vergata”, Via della Ricerca Scientifica, 00133 Rome, Italy.
EMail: isola@mat.uniroma2.it

Received: 08 November, 2008
Accepted: 20 March, 2009
Communicated by: I. Pinelis
2000 AMS Sub. Class.: 62F11, 62B10.
Key words: Fisher information, Stam inequality.

Abstract: We prove a version of the Stam inequality for random variables taking values on the circle $S^1$. Furthermore, we prove that equality occurs only for the uniform distribution.


Contents

1. Introduction
2. Fisher Information and Stam Inequality on $\mathbb{R}$
3. Stam Inequality on $S^1$
4. Proof of the Main Result


1. Introduction

It is well-known that the Gaussian, Poisson, Wigner and (discrete) uniform distributions are maximum entropy distributions in the appropriate context (see, for example, [18, 6, 7]). On the other hand, all of the above distributions can be characterized as those giving equality in the Stam inequality. Let us describe what the Stam inequality is about.

The Fisher information $I_X$ of a real random variable (with strictly positive, differentiable density function $f$) is defined as

$$I_X := \int \left(\frac{f'(x)}{f(x)}\right)^2 f(x)\,dx. \tag{1.1}$$

For $X$, $Y$ independent random variables such that $I_X, I_Y < \infty$, Stam was able to prove the inequality

$$\frac{1}{I_{X+Y}} \ge \frac{1}{I_X} + \frac{1}{I_Y}, \tag{1.2}$$

where equality holds iff $X$, $Y$ are Gaussian (see [16, 1]).

It is difficult to overestimate the importance of the above result because of its links with fundamental results in analysis, probability, statistics, information theory, statistical mechanics and so on (see [2, 3, 9, 17]). Different proofs and deep generalizations of the theorem appear in the recent literature on the subject (see [19, 13]).

A free analogue of Fisher information has been introduced in free probability. Also in this case one can prove a Stam-like inequality. It is not surprising that the equality case characterizes the Wigner distribution, which is, in many respects, the free analogue of the Gaussian distribution (see [18]).


In the discrete setting, one can introduce appropriate versions of Fisher information and prove the Stam inequality. On the integers $\mathbb{Z}$, equality characterizes the Poisson distribution, while on a finite group $G$ equality occurs for the uniform distribution (see [8, 15, 10, 11, 12, 14, 4, 5]).

In this short note we show that a version of the Stam inequality can also be proved on the circle $S^1$. This result is obtained by suitable modifications of the standard proofs. Moreover, equality occurs only for the maximum entropy distribution, namely for the uniform distribution on the circle.


2. Fisher Information and Stam Inequality on $\mathbb{R}$

Let $f : \mathbb{R} \to \mathbb{R}$ be a differentiable, strictly positive density. One may define the $f$-score $J_f : \mathbb{R} \to \mathbb{R}$ by

$$J_f := \frac{f'}{f}.$$

Note that $J_f$ is $f$-centered, in the sense that $E_f(J_f) = 0$. In general, if $X : (\Omega, \mathcal{F}, p) \to \mathbb{R}$ is a random variable with density $f$, we write $J_X = J_f$ and

$$I_X = \mathrm{Var}_f(J_f) = E_f[J_f^2];$$

namely,

$$I_X := \int_{\mathbb{R}} \left(\frac{f'(x)}{f(x)}\right)^2 f(x)\,dx. \tag{2.1}$$

Let us suppose that $I_X, I_Y < \infty$.

Theorem 2.1 ([16]). If $X, Y : (\Omega, \mathcal{F}, p) \to \mathbb{R}$ are independent random variables, then

$$\frac{1}{I_{X+Y}} \ge \frac{1}{I_X} + \frac{1}{I_Y}, \tag{2.2}$$

with equality if and only if $X$ and $Y$ are Gaussian.
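Although the paper proceeds purely analytically, the content of Theorem 2.1 is easy to probe numerically. The following is a minimal Python sketch (our own illustration, not part of the original paper): it approximates Fisher informations on a grid by finite differences and Riemann sums, checks that two Gaussians give equality in (2.2), and checks that a non-Gaussian summand makes the inequality strict. All helper names (`gauss`, `fisher_information`) are ours.

```python
import numpy as np

x = np.linspace(-25.0, 25.0, 16001)
dx = x[1] - x[0]

def gauss(x, s):
    """Centered Gaussian density with standard deviation s."""
    return np.exp(-x**2 / (2.0 * s**2)) / (s * np.sqrt(2.0 * np.pi))

def fisher_information(f):
    """Approximate I = \\int (f'/f)^2 f dx from (2.1) by a Riemann sum."""
    df = np.gradient(f, dx)
    return np.sum(df**2 / f) * dx

# Gaussian case: I = 1/s^2, so 1/I_{X+Y} = s1^2 + s2^2 = 1/I_X + 1/I_Y (equality).
fX, fY = gauss(x, 1.0), gauss(x, 2.0)
fZ = np.convolve(fX, fY, mode="same") * dx       # density of X + Y
IX, IY, IZ = map(fisher_information, (fX, fY, fZ))
print(1.0 / IZ, 1.0 / IX + 1.0 / IY)             # both ~5.0

# Non-Gaussian X (a symmetric Gaussian mixture): the inequality is strict.
fX = 0.5 * gauss(x - 2.0, 1.0) + 0.5 * gauss(x + 2.0, 1.0)
fZ = np.convolve(fX, fY, mode="same") * dx
print(1.0 / fisher_information(fZ), 1.0 / fisher_information(fX) + 1.0 / IY)
```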


3. Stam Inequality on $S^1$

We denote by $S^1$ the circle group, namely the multiplicative subgroup of $\mathbb{C} \setminus \{0\}$ defined as

$$S^1 := \{z \in \mathbb{C} : |z| = 1\}.$$

We say that a function $f : S^1 \to \mathbb{R}$ has a tangential derivative at $z \in S^1$ if the following limit exists and is finite:

$$D_T f(z) := \lim_{h \to 0} \frac{1}{h}\left(f(z e^{ih}) - f(z)\right).$$

From now on we consider functions $f : S^1 \to \mathbb{R}$ that are twice differentiable, strictly positive densities. Then the $f$-score is defined as

$$J_f := \frac{D_T f}{f},$$

and is $f$-centered, in the sense that $E_f(J_f) = 0$, where $E_f(g) := \int_{S^1} g f\,d\mu$ and $\mu$ is the normalized Haar measure on $S^1$.

If $X : (\Omega, \mathcal{F}, p) \to S^1$ is a random variable with density $f$, we write $J_X = J_f$ and define the Fisher information as

$$I_X := \mathrm{Var}_f(J_f) = E_f[J_f^2].$$

The main result of this paper is the proof of the following version of the Stam inequality on the circle.

Theorem 3.1. If $X, Y : (\Omega, \mathcal{F}, p) \to S^1$ are independent random variables, then

$$\frac{1}{I_{XY}} \ge \frac{1}{I_X} + \frac{1}{I_Y}, \tag{3.1}$$

with equality if and only if $X$ or $Y$ is uniform.
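A numerical illustration of Theorem 3.1 may help fix ideas (our own sketch, not part of the original paper; it anticipates the identification of $S^1$ with $[0, 2\pi)$ made in Section 4). Densities are taken with respect to the normalized Haar measure $d\theta/2\pi$; the product $XY$ on $S^1$ corresponds to adding the angles modulo $2\pi$, so the density of $XY$ is a normalized circular convolution, computed here via the FFT. The von Mises family and all helper names are our choices.

```python
import numpy as np

N = 4096
theta = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
dtheta = 2.0 * np.pi / N

def von_mises(theta, mu, kappa):
    """von Mises density w.r.t. normalized Haar measure: (1/2π)∫ f dθ = 1."""
    return np.exp(kappa * np.cos(theta - mu)) / np.i0(kappa)

def score(f):
    """f-score J_f = f'/f, with a periodic central difference for f'."""
    df = (np.roll(f, -1) - np.roll(f, 1)) / (2.0 * dtheta)
    return df / f

def fisher_information(f):
    """I_X = E_f[J_f^2] = (1/2π)∫ J_f(θ)^2 f(θ) dθ ≈ grid mean of J_f^2 f."""
    return np.mean(score(f)**2 * f)

def circ_conv(fX, fY):
    """Density of the sum of the angles mod 2π: (1/2π)∫ fX(z-y) fY(y) dy."""
    return np.real(np.fft.ifft(np.fft.fft(fX) * np.fft.fft(fY))) / N

fX = von_mises(theta, 0.0, 2.0)
fY = von_mises(theta, 1.0, 3.0)
fZ = circ_conv(fX, fY)

IX, IY, IZ = map(fisher_information, (fX, fY, fZ))
print(1.0 / IZ, 1.0 / IX + 1.0 / IY)   # strict: left side exceeds right side

# Degenerate case: the uniform density f ≡ 1 has zero score, hence I = 0,
# and both sides of (3.1) are infinite (cf. Propositions 4.2 and 4.3 below).
print(fisher_information(np.ones(N)))  # 0.0
```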


4. Proof of the Main Result

To prove our result we identify $S^1$ with the interval $[0, 2\pi]$, where $0$ and $2\pi$ are identified and the sum is taken modulo $2\pi$. Any function $f : [0, 2\pi] \to \mathbb{R}$ such that $f(0) = f(2\pi)$ can be thought of as a function on $S^1$. In this representation, the tangential derivative is replaced by an ordinary derivative.

In this context, a density is a nonnegative function $f : [0, 2\pi] \to \mathbb{R}$ such that

$$\frac{1}{2\pi} \int_0^{2\pi} f(\theta)\,d\theta = 1.$$

The uniform density is the function

$$f(\theta) = 1, \qquad \forall\,\theta \in [0, 2\pi].$$

From now on, we shall consider $f$ belonging to the class

$$\mathcal{P} := \left\{ f : [0, 2\pi] \to \mathbb{R} \;\middle|\; \int_0^{2\pi} f(\theta)\,d\theta = 2\pi,\; f > 0 \text{ a.e.},\; f \in C^2(S^1),\; f^{(k)}(0) = f^{(k)}(2\pi),\; k = 0, 1, 2 \right\}.$$

Let $f \in \mathcal{P}$; then

$$\int_0^{2\pi} f'(\theta)\,d\theta = 0,$$

and therefore

$$J_f := \frac{f'}{f}$$

is $f$-centered. Note that $J_f(0) = J_f(2\pi)$.

If $X : (\Omega, \mathcal{F}, p) \to [0, 2\pi]$ is a random variable with density $f \in \mathcal{P}$, from the score $J_X := J_f$ it is possible to define the Fisher information

$$I_X := \mathrm{Var}_f(J_f) = E_f[J_f^2].$$

In this additive (modulo $2\pi$) context, the main result we want to prove takes the following, more traditional, form.

Theorem 4.1. If $X, Y : (\Omega, \mathcal{F}, p) \to [0, 2\pi]$ are independent random variables, then

$$\frac{1}{I_{X+Y}} \ge \frac{1}{I_X} + \frac{1}{I_Y}, \tag{4.1}$$

with equality if and only if $X$ or $Y$ is uniform.

Note that, since $[0, 2\pi]$ is compact, the condition $I_X < \infty$ always holds. However, we cannot ensure in general that $I_X \neq 0$. In fact, it is easy to characterize this degenerate case.

Proposition 4.2. The following conditions are equivalent:

(i) $X$ has the uniform distribution;

(ii) $I_X = 0$;

(iii) $J_X$ is constant.

Proof. (i) $\Rightarrow$ (ii): Obvious.

(ii) $\Rightarrow$ (iii): Obvious.

(iii) $\Rightarrow$ (i): Let $J_X(x) = \beta$ for every $x$. Then $f_X$ is the solution of the differential equation

$$\frac{f_X'(x)}{f_X(x)} = \beta, \qquad f(0) = f(2\pi).$$

Thus $f_X(x) = c e^{\beta x}$, and the symmetry condition implies $\beta = 0$; the normalization of the density then forces $c = 1$, so that $f_X$ is the uniform density.


Proposition 4.3. Let $X, Y : (\Omega, \mathcal{F}, p) \to [0, 2\pi]$ be independent random variables whose densities belong to $\mathcal{P}$. If $X$ (or $Y$) has the uniform distribution, then

$$\frac{1}{I_{X+Y}} = \frac{1}{I_X} + \frac{1}{I_Y},$$

in the sense that both sides of the equality are infinite.

Proof. Because of independence, the convolution formula shows that if $X$ is uniform then so is $X + Y$, and therefore we are done by Proposition 4.2.

As a result of the above proposition, in what follows we consider random variables with strictly positive Fisher information. Before proving the main result, we need the following lemma.

Lemma 4.4. Let $X, Y : (\Omega, \mathcal{F}, p) \to [0, 2\pi]$ be two independent random variables with densities $f_X, f_Y \in \mathcal{P}$, and let $Z := X + Y$. Then

$$J_Z(Z) = E_p[J_X(X) \mid Z] = E_p[J_Y(Y) \mid Z]. \tag{4.2}$$

Proof. Let $f_Z$ be the density of $Z$, namely

$$f_Z(z) = \frac{1}{2\pi} \int_0^{2\pi} f_X(z - y) f_Y(y)\,dy, \qquad z \in [0, 2\pi],$$

with $f_Z \in \mathcal{P}$. Then

$$f_Z'(z) = \frac{1}{2\pi} \frac{d}{dz} \int_0^{2\pi} f_X(z - y) f_Y(y)\,dy = \frac{1}{2\pi} \int_0^{2\pi} f_Y(y) f_X'(z - y)\,dy = (f_X' * f_Y)(z).$$


Therefore, given $z \in [0, 2\pi]$,

$$\begin{aligned}
J_Z(z) &= \frac{f_Z'(z)}{f_Z(z)} \\
&= \frac{1}{2\pi} \int_0^{2\pi} \frac{f_X(x) f_Y(z - x)}{f_Z(z)}\,\frac{f_X'(x)}{f_X(x)}\,dx \\
&= \frac{1}{2\pi} \int_0^{2\pi} J_X(x)\, f_{X|Z}(x \mid z)\,dx \\
&= E_{f_X}[J_X \mid Z] \\
&= E_p[J_X(X) \mid Z].
\end{aligned}$$

Similarly, by the symmetry of the convolution formula one obtains $J_Z(z) = E_p[J_Y(Y) \mid Z]$, $z \in [0, 2\pi]$, proving Lemma 4.4.
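The computational core of this proof, $f_Z' = f_X' * f_Y$ (with the normalized circular convolution), can be confirmed numerically. A short check, reusing the grid, `von_mises`, `circ_conv` and the variables `fX`, `fY`, `fZ` from the sketch following Theorem 3.1 (again our own illustration, not part of the original paper):

```python
# Continuation of the sketch after Theorem 3.1 (requires its definitions).
def periodic_derivative(f):
    """Central difference with wrap-around, matching the periodicity of P."""
    return (np.roll(f, -1) - np.roll(f, 1)) / (2.0 * dtheta)

lhs = periodic_derivative(fZ)                  # f_Z' computed directly
rhs = circ_conv(periodic_derivative(fX), fY)   # (f_X' * f_Y)(z)
print(np.max(np.abs(lhs - rhs)))               # ~0 up to discretization error
# Dividing by f_Z turns this identity into the conditional-expectation
# formula (4.2) for the scores, as in the proof above.
```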

We are ready to prove the main result.

Theorem 4.5. Let $X, Y : (\Omega, \mathcal{F}, p) \to [0, 2\pi]$ be two independent random variables such that $I_X, I_Y > 0$. Then

$$\frac{1}{I_{X+Y}} > \frac{1}{I_X} + \frac{1}{I_Y}. \tag{4.3}$$

Proof. Let $a, b \in \mathbb{R}$ and let $Z := X + Y$; then, by Lemma 4.4,

$$E_p[a J_X(X) + b J_Y(Y) \mid Z] = a E_p[J_X(X) \mid Z] + b E_p[J_Y(Y) \mid Z] = (a + b) J_Z(Z). \tag{4.4}$$


Hence, applying Jensen's inequality, we obtain

$$\begin{aligned}
E_p[(a J_X(X) + b J_Y(Y))^2] &= E_p\big[E_p[(a J_X(X) + b J_Y(Y))^2 \mid Z]\big] \\
&\ge E_p\big[E_p[a J_X(X) + b J_Y(Y) \mid Z]^2\big] \\
&= E_p[(a + b)^2 J_Z(Z)^2] \\
&= (a + b)^2 I_Z,
\end{aligned} \tag{4.5}$$

and thus

$$\begin{aligned}
(a + b)^2 I_Z &\le E_p[(a J_X(X) + b J_Y(Y))^2] \\
&= a^2 E_p[J_X(X)^2] + 2ab\,E_p[J_X(X) J_Y(Y)] + b^2 E_p[J_Y(Y)^2] \\
&= a^2 I_X + b^2 I_Y + 2ab\,E_p[J_X(X) J_Y(Y)] \\
&= a^2 I_X + b^2 I_Y,
\end{aligned}$$

where the last equality follows from independence and from the fact that the score is a centered random variable.

Now, take $a := 1/I_X$ and $b := 1/I_Y$; then we obtain

$$\left(\frac{1}{I_X} + \frac{1}{I_Y}\right)^2 I_Z \le \frac{1}{I_X} + \frac{1}{I_Y}. \tag{4.6}$$

Dividing both sides by $\left(\frac{1}{I_X} + \frac{1}{I_Y}\right)^2$ gives $I_Z \le \left(\frac{1}{I_X} + \frac{1}{I_Y}\right)^{-1}$, which is (4.3) with $\ge$ in place of $>$.

It remains to be proved that equality cannot hold in (4.6). Define $c := a + b$, where, again, $a = 1/I_X$ and $b = 1/I_Y$; then equality holds in (4.6) if and only if

$$c^2 I_Z = a^2 I_X + b^2 I_Y. \tag{4.7}$$

Let us prove that (4.7) is equivalent to

$$a J_X(X) + b J_Y(Y) = c J_Z(X + Y) \quad \text{a.e.} \tag{4.8}$$


Indeed, let $H := a J_X(X) + b J_Y(Y)$; then equality occurs in (4.5) if and only if

$$E_p[H^2 \mid Z] = (E_p[H \mid Z])^2 \quad \text{a.e.},$$

i.e.,

$$E_p[(H - E_p[H \mid Z])^2 \mid Z] = 0 \quad \text{a.e.}$$

Therefore $H = E_p[H \mid Z]$ a.e., so that, by (4.4),

$$c J_Z(Z) = E_p[a J_X(X) + b J_Y(Y) \mid Z] = a J_X(X) + b J_Y(Y) \quad \text{a.e.},$$

i.e., (4.8) is true. Conversely, if (4.8) holds, then squaring both sides and taking expectations yields (4.7).

Let $x, y \in [0, 2\pi]$; because of independence,

$$f_{X,Y}(x, y) = f_X(x) \cdot f_Y(y) \neq 0.$$

Thus, it makes sense to write equality (4.8) for all $x, y \in [0, 2\pi]$:

$$a J_X(x) + b J_Y(y) = c J_Z(x + y). \tag{4.9}$$

Differentiating (4.9) with respect to $x$ and with respect to $y$, and subtracting the resulting relations, one obtains

$$a J_X'(x) = b J_Y'(y), \qquad \forall\, x, y \in [0, 2\pi],$$

which implies that $J_X'(x) = \alpha$ is constant, i.e.,

$$J_X(x) = \beta + \alpha x, \qquad x \in [0, 2\pi].$$

In particular, the symmetry condition yields

$$\beta = J_X(0) = J_X(2\pi) = \beta + 2\pi\alpha.$$

This implies that $\alpha = 0$, that is, $J_X$ is constant. By Proposition 4.2 one then has $I_X = 0$, which contradicts the hypotheses and ends the proof.


References

[1] N.M. BLACHMAN, The convolution inequality for entropy powers, IEEE Trans. Inform. Theory, 11 (1965), 267–271.

[2] E. CARLEN, Superadditivity of Fisher's information and logarithmic Sobolev inequalities, J. Funct. Anal., 101(1) (1991), 194–211.

[3] A. DEMBO, T. COVER AND J. THOMAS, Information theoretic inequalities, IEEE Trans. Inform. Theory, 37(6) (1991), 1501–1518.

[4] P. GIBILISCO, D. IMPARATO AND T. ISOLA, Stam inequality on $\mathbb{Z}_n$, Statist. Probab. Lett., 78(13) (2008), 1851–1856.

[5] P. GIBILISCO AND T. ISOLA, Fisher information and Stam inequality on a finite group, Bull. Lond. Math. Soc., 40(5) (2008), 855–862.

[6] P. HARREMOËS, Binomial and Poisson distributions as maximum entropy distributions, IEEE Trans. Inform. Theory, 47(5) (2001), 2039–2041.

[7] O.T. JOHNSON, Log-concavity and the maximum entropy property of the Poisson distribution, Stoch. Proc. Appl., 117(6) (2007), 791–802.

[8] I.M. JOHNSTONE AND B. MACGIBBON, Une mesure d'information caractérisant la loi de Poisson, in Séminaire de Probabilités, XXI, vol. 1247 of Lecture Notes in Math., 563–573, Springer, Berlin, 1987.

[9] A. KAGAN AND Z. LANDSMAN, Statistical meaning of Carlen's superadditivity of the Fisher information, Statist. Probab. Lett., 32 (1997), 175–179.

[10] A. KAGAN, A discrete version of Stam inequality and a characterization of the Poisson distribution, J. Statist. Plann. Inference, 92(1-2) (2001), 7–12.


[11] A. KAGAN, Letter to the editor: "A discrete version of Stam inequality and a characterization of the Poisson distribution" [J. Statist. Plann. Inference, 92(1-2) (2001), 7–12], J. Statist. Plann. Inference, 99(1) (2001), 1.

[12] I. KONTOYIANNIS, P. HARREMOËS AND O. JOHNSON, Entropy and the law of small numbers, IEEE Trans. Inform. Theory, 51(2) (2005), 466–472.

[13] M. MADIMAN AND A.R. BARRON, Generalized entropy power inequalities and monotonicity properties of information, IEEE Trans. Inform. Theory, 53(7) (2007), 2317–2329.

[14] M. MADIMAN, O. JOHNSON AND I. KONTOYIANNIS, Fisher information, compound Poisson approximation and the Poisson channel, Proc. IEEE Intl. Symp. Inform. Theory, Nice, France, 2007.

[15] V. PAPATHANASIOU, Some characteristic properties of the Fisher information matrix via Cacoullos-type inequalities, J. Multivariate Anal., 44(2) (1993), 256–265.

[16] A.J. STAM, Some inequalities satisfied by the quantities of information of Fisher and Shannon, Information and Control, 2 (1959), 101–112.

[17] C. VILLANI, Cercignani's conjecture is sometimes true and always almost true, Comm. Math. Phys., 234(3) (2003), 455–490.

[18] D. VOICULESCU, The analogues of entropy and of Fisher's information measure in free probability theory. V. Noncommutative Hilbert transforms, Invent. Math., 132(1) (1998), 189–227.

[19] R. ZAMIR, A proof of the Fisher information inequality via a data processing argument, IEEE Trans. Inform. Theory, 44(3) (1998), 1246–1250.
