http://jipam.vu.edu.au/

Volume 2, Issue 1, Article 5, 2001

FURTHER REVERSE RESULTS FOR JENSEN’S DISCRETE INEQUALITY AND APPLICATIONS IN INFORMATION THEORY

I. BUDIMIR, S.S. DRAGOMIR, AND J. PEČARIĆ

DEPARTMENT OF MATHEMATICS, FACULTY OF TEXTILE TECHNOLOGY, UNIVERSITY OF ZAGREB, CROATIA.

SCHOOL OF COMMUNICATIONS AND INFORMATICS, VICTORIA UNIVERSITY OF TECHNOLOGY, P.O. BOX 14428, MELBOURNE CITY MC, VICTORIA 8001, AUSTRALIA. sever.dragomir@vu.edu.au
URL: http://rgmia.vu.edu.au/SSDragomirWeb.html

DEPARTMENT OF APPLIED MATHEMATICS, UNIVERSITY OF ADELAIDE, ADELAIDE, 5001.
URL: http://mahazu.hazu.hr/DepMPCS/indexJP.html

Received 12 April, 2000; accepted 6 October, 2000. Communicated by L.-E. Persson.

ABSTRACT. Some new inequalities which counterpart Jensen's discrete inequality and improve the recent results from [4] and [5] are given. A related result for generalized means is established. Applications in Information Theory are also provided.

Key words and phrases: Convex functions, Jensen’s Inequality, Entropy Mappings.

2000 Mathematics Subject Classification. 26D15, 94Xxx.

1. INTRODUCTION

Let $f : X \to \mathbb{R}$ be a convex mapping defined on the linear space $X$ and $x_i \in X$, $p_i \geq 0$ $(i = 1, \ldots, m)$ with $P_m := \sum_{i=1}^m p_i > 0$.

The following inequality is well known in the literature as Jensen’s inequality

$$f\left(\frac{1}{P_m}\sum_{i=1}^m p_i x_i\right) \leq \frac{1}{P_m}\sum_{i=1}^m p_i f(x_i). \tag{1.1}$$

There are many well known inequalities which are particular cases of Jensen's inequality, such as the weighted arithmetic mean-geometric mean-harmonic mean inequality, the Ky Fan inequality, the Hölder inequality, etc. For a comprehensive list of recent results on Jensen's inequality, see the book [25] and the papers [9]-[15] where further references are given.

In 1994, Dragomir and Ionescu [18] proved the following inequality which counterparts (1.1) for real mappings of a real variable.

ISSN (electronic): 1443-5756

© 2001 Victoria University. All rights reserved.


Theorem 1.1. Let $f : I \subseteq \mathbb{R} \to \mathbb{R}$ be a differentiable convex mapping on $\mathring{I}$ ($\mathring{I}$ is the interior of $I$), $x_i \in \mathring{I}$, $p_i \geq 0$ $(i = 1, \ldots, n)$ and $\sum_{i=1}^n p_i = 1$. Then we have the inequality

$$0 \leq \sum_{i=1}^n p_i f(x_i) - f\left(\sum_{i=1}^n p_i x_i\right) \leq \sum_{i=1}^n p_i x_i f'(x_i) - \sum_{i=1}^n p_i x_i \sum_{i=1}^n p_i f'(x_i), \tag{1.2}$$

where $f'$ is the derivative of $f$ on $\mathring{I}$.

Using this result and the discrete version of the Grüss inequality for weighted sums, S.S. Dragomir obtained the following simple counterpart of Jensen's inequality [5]:

Theorem 1.2. With the above assumptions for $f$ and if $m, M \in \mathring{I}$ and $m \leq x_i \leq M$ $(i = 1, \ldots, n)$, then we have

$$0 \leq \sum_{i=1}^n p_i f(x_i) - f\left(\sum_{i=1}^n p_i x_i\right) \leq \frac{1}{4}(M - m)\left(f'(M) - f'(m)\right). \tag{1.3}$$

This was subsequently applied in Information Theory for Shannon’s and Rényi’s entropy.

In this paper we point out some other counterparts of Jensen’s inequality that are similar to (1.3), some of which are better than the above inequalities.
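To make the size of such counterpart bounds concrete, the following Python sketch numerically checks (1.1) and (1.3) for the convex map $f(t) = t^2$. This is an illustration we add here, not code from the paper; the helper names (`jensen_difference`, `fprime`) and the random data are our own.

```python
import random

def jensen_difference(f, p, x):
    """sum_i p_i f(x_i) - f(sum_i p_i x_i) for weights p summing to 1."""
    mean = sum(pi * xi for pi, xi in zip(p, x))
    return sum(pi * f(xi) for pi, xi in zip(p, x)) - f(mean)

f = lambda t: t * t        # convex on R
fprime = lambda t: 2 * t   # its derivative

random.seed(0)
n = 10
x = [random.uniform(1.0, 5.0) for _ in range(n)]
w = [random.random() for _ in range(n)]
p = [wi / sum(w) for wi in w]   # probability weights

d = jensen_difference(f, p, x)
m, M = min(x), max(x)
bound = 0.25 * (M - m) * (fprime(M) - fprime(m))   # right-hand side of (1.3)
assert 0.0 <= d <= bound
print(f"Jensen difference = {d:.6f} <= bound (1.3) = {bound:.6f}")
```

For $f(t) = t^2$ the Jensen difference is exactly the weighted variance of the sample, which makes the bound easy to check by eye.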

2. SOME NEW COUNTERPARTS FOR JENSEN'S DISCRETE INEQUALITY

The following result holds.

Theorem 2.1. Let $f : I \subseteq \mathbb{R} \to \mathbb{R}$ be a differentiable convex mapping on $\mathring{I}$ and $x_i \in \mathring{I}$ with $x_1 \leq x_2 \leq \cdots \leq x_n$ and $p_i \geq 0$ $(i = 1, \ldots, n)$ with $\sum_{i=1}^n p_i = 1$. Then we have

$$0 \leq \sum_{i=1}^n p_i f(x_i) - f\left(\sum_{i=1}^n p_i x_i\right) \leq (x_n - x_1)\left(f'(x_n) - f'(x_1)\right) \max_{1 \leq k \leq n-1} P_k \bar{P}_{k+1} \leq \frac{1}{4}(x_n - x_1)\left(f'(x_n) - f'(x_1)\right), \tag{2.1}$$

where $P_k := \sum_{i=1}^k p_i$ and $\bar{P}_{k+1} := 1 - P_k$.

Proof. We use the following Grüss type inequality due to J.E. Pečarić (see for example [25]):

$$\left|\frac{1}{Q_n}\sum_{i=1}^n q_i a_i b_i - \frac{1}{Q_n}\sum_{i=1}^n q_i a_i \cdot \frac{1}{Q_n}\sum_{i=1}^n q_i b_i\right| \leq |a_n - a_1|\,|b_n - b_1| \max_{1 \leq k \leq n-1} \frac{Q_k \bar{Q}_{k+1}}{Q_n^2}, \tag{2.2}$$

provided that $a, b$ are two monotonic $n$-tuples, $q$ is a positive one, $Q_n := \sum_{i=1}^n q_i > 0$, $Q_k := \sum_{i=1}^k q_i$ and $\bar{Q}_{k+1} := Q_n - Q_k$.

If in (2.2) we choose $q_i = p_i$, $a_i = x_i$, $b_i = f'(x_i)$ (and $a_i$, $b_i$ will be monotonic nondecreasing), then we may state that

$$\sum_{i=1}^n p_i x_i f'(x_i) - \sum_{i=1}^n p_i x_i \sum_{i=1}^n p_i f'(x_i) \leq (x_n - x_1)\left(f'(x_n) - f'(x_1)\right) \max_{1 \leq k \leq n-1} P_k \bar{P}_{k+1}. \tag{2.3}$$

Now, using (1.2) and (2.3) we obtain the first inequality in (2.1).

For the second inequality, we observe that
$$P_k \bar{P}_{k+1} = P_k(1 - P_k) \leq \frac{1}{4}\left(P_k + 1 - P_k\right)^2 = \frac{1}{4}$$
for all $k \in \{1, \ldots, n-1\}$, and then
$$\max_{1 \leq k \leq n-1} P_k \bar{P}_{k+1} \leq \frac{1}{4},$$
which proves the last part of (2.1).

Remark 2.2. It is obvious that the inequality (2.1) is an improvement of (1.3) if we assume that the order for $x_i$ is as in the statement of Theorem 2.1.
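As an illustration of how much sharper the weight-dependent factor $\max_k P_k \bar{P}_{k+1}$ can be than the crude constant $\frac{1}{4}$, here is a small numerical check of (2.1) for $f(t) = -\ln t$; the data and names are ours, chosen only for the example.

```python
import math
import random

random.seed(1)
n = 8
x = sorted(random.uniform(0.5, 4.0) for _ in range(n))   # x_1 <= ... <= x_n
w = [random.random() for _ in range(n)]
p = [wi / sum(w) for wi in w]

f = lambda t: -math.log(t)    # convex on (0, inf)
fp = lambda t: -1.0 / t       # derivative, nondecreasing

mean = sum(pi * xi for pi, xi in zip(p, x))
diff = sum(pi * f(xi) for pi, xi in zip(p, x)) - f(mean)

# max over k = 1..n-1 of P_k * (1 - P_k), with P_k = p_1 + ... + p_k
Pk, cmax = 0.0, 0.0
for k in range(n - 1):
    Pk += p[k]
    cmax = max(cmax, Pk * (1.0 - Pk))

sharp = (x[-1] - x[0]) * (fp(x[-1]) - fp(x[0])) * cmax   # middle term of (2.1)
crude = 0.25 * (x[-1] - x[0]) * (fp(x[-1]) - fp(x[0]))   # last term of (2.1)
assert 0.0 <= diff <= sharp <= crude
print(f"difference = {diff:.6f} <= {sharp:.6f} <= {crude:.6f}")
```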

Another result is embodied in the following theorem.

Theorem 2.3. Let $f : I \subseteq \mathbb{R} \to \mathbb{R}$ be a differentiable convex mapping on $\mathring{I}$ and $m, M \in \mathring{I}$ with $m \leq x_i \leq M$ $(i = 1, \ldots, n)$ and $p_i \geq 0$ $(i = 1, \ldots, n)$ with $\sum_{i=1}^n p_i = 1$. If $S$ is a subset of the set $\{1, \ldots, n\}$ minimizing the expression

$$\left|\sum_{i \in S} p_i - \frac{1}{2}\right|, \tag{2.4}$$

then we have the inequality

$$0 \leq \sum_{i=1}^n p_i f(x_i) - f\left(\sum_{i=1}^n p_i x_i\right) \leq Q(M - m)\left(f'(M) - f'(m)\right) \leq \frac{1}{4}(M - m)\left(f'(M) - f'(m)\right), \tag{2.5}$$

where
$$Q = \sum_{i \in S} p_i \left(1 - \sum_{i \in S} p_i\right).$$

Proof. We use the following Grüss type inequality due to Andrica and Badea [2]:

$$\left|Q_n \sum_{i=1}^n q_i a_i b_i - \sum_{i=1}^n q_i a_i \cdot \sum_{i=1}^n q_i b_i\right| \leq (M_1 - m_1)(M_2 - m_2) \sum_{i \in S} q_i \left(Q_n - \sum_{i \in S} q_i\right), \tag{2.6}$$

provided that $m_1 \leq a_i \leq M_1$, $m_2 \leq b_i \leq M_2$ for $i = 1, \ldots, n$, and $S$ is the subset of $\{1, \ldots, n\}$ which minimises the expression
$$\left|\sum_{i \in S} q_i - \frac{1}{2} Q_n\right|.$$

Choosing $q_i = p_i$, $a_i = x_i$, $b_i = f'(x_i)$, then we may state that

$$0 \leq \sum_{i=1}^n p_i x_i f'(x_i) - \sum_{i=1}^n p_i x_i \sum_{i=1}^n p_i f'(x_i) \leq (M - m)\left(f'(M) - f'(m)\right) \sum_{i \in S} p_i \left(1 - \sum_{i \in S} p_i\right). \tag{2.7}$$

Now, using (1.2) and (2.7), we obtain the first inequality in (2.5). For the last part, we observe that
$$Q \leq \frac{1}{4}\left(\sum_{i \in S} p_i + 1 - \sum_{i \in S} p_i\right)^2 = \frac{1}{4},$$
and the theorem is thus proved.
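Theorem 2.3 requires the subset $S$ minimizing (2.4), which is a subset-sum style search; for small $n$ it can simply be brute-forced. The following sketch (our illustration, with hypothetical data) finds $S$, forms $Q$, and checks (2.5) for $f(t) = t^2$.

```python
from itertools import combinations
import random

random.seed(2)
n = 10
x = [random.uniform(1.0, 3.0) for _ in range(n)]
w = [random.random() for _ in range(n)]
p = [wi / sum(w) for wi in w]

# exhaustive search for the subset S minimizing |sum_{i in S} p_i - 1/2|
best = min(
    (abs(sum(p[i] for i in S) - 0.5), S)
    for r in range(n + 1)
    for S in combinations(range(n), r)
)
S = best[1]
PS = sum(p[i] for i in S)
Q = PS * (1.0 - PS)   # always <= 1/4

f, fp = (lambda t: t * t), (lambda t: 2 * t)
mean = sum(pi * xi for pi, xi in zip(p, x))
diff = sum(pi * f(xi) for pi, xi in zip(p, x)) - f(mean)
m, M = min(x), max(x)

assert 0.0 <= diff <= Q * (M - m) * (fp(M) - fp(m)) <= 0.25 * (M - m) * (fp(M) - fp(m))
print(f"Q = {Q:.6f}, difference = {diff:.6f}")
```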

The following inequality is well known in the literature as the arithmetic mean-geometric mean-harmonic mean inequality:

$$A_n(p, x) \geq G_n(p, x) \geq H_n(p, x), \tag{2.8}$$

where
$$A_n(p, x) := \sum_{i=1}^n p_i x_i \quad \text{(the arithmetic mean)},$$
$$G_n(p, x) := \prod_{i=1}^n x_i^{p_i} \quad \text{(the geometric mean)},$$
$$H_n(p, x) := \frac{1}{\sum_{i=1}^n \frac{p_i}{x_i}} \quad \text{(the harmonic mean)},$$
and $\sum_{i=1}^n p_i = 1$ $\left(p_i \geq 0,\ i = \overline{1, n}\right)$.

Using the above two theorems, we are able to point out the following reverse of the AGH-inequality.

Proposition 2.4. Let $x_i > 0$ $(i = 1, \ldots, n)$ and $p_i \geq 0$ with $\sum_{i=1}^n p_i = 1$.

(i) If $x_1 \leq x_2 \leq \cdots \leq x_{n-1} \leq x_n$, then we have

$$1 \leq \frac{A_n(p, x)}{G_n(p, x)} \leq \exp\left[\frac{(x_n - x_1)^2}{x_1 x_n} \max_{1 \leq k \leq n-1} P_k \bar{P}_{k+1}\right] \leq \exp\left[\frac{1}{4} \cdot \frac{(x_n - x_1)^2}{x_1 x_n}\right]. \tag{2.9}$$

(ii) If the set $S \subseteq \{1, \ldots, n\}$ minimizes the expression (2.4), and $0 < m \leq x_i \leq M < \infty$ $(i = 1, \ldots, n)$, then

$$1 \leq \frac{A_n(p, x)}{G_n(p, x)} \leq \exp\left[Q \cdot \frac{(M - m)^2}{mM}\right] \leq \exp\left[\frac{1}{4} \cdot \frac{(M - m)^2}{mM}\right]. \tag{2.10}$$

The proof goes by the inequalities (2.1) and (2.5), choosing $f(x) = -\ln x$. A similar result can be stated for $G_n$ and $H_n$.
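A quick numerical check of the reverse AG-inequality (2.9) may also be useful; the following sketch is illustrative only, with data chosen by us.

```python
import math
import random

random.seed(3)
n = 6
x = sorted(random.uniform(1.0, 2.5) for _ in range(n))
w = [random.random() for _ in range(n)]
p = [wi / sum(w) for wi in w]

A = sum(pi * xi for pi, xi in zip(p, x))            # weighted arithmetic mean
G = math.prod(xi ** pi for pi, xi in zip(p, x))     # weighted geometric mean

Pk, cmax = 0.0, 0.0
for k in range(n - 1):
    Pk += p[k]
    cmax = max(cmax, Pk * (1.0 - Pk))

mid = math.exp((x[-1] - x[0]) ** 2 / (x[0] * x[-1]) * cmax)   # middle term of (2.9)
top = math.exp(0.25 * (x[-1] - x[0]) ** 2 / (x[0] * x[-1]))   # last term of (2.9)
assert 1.0 <= A / G <= mid <= top
print(f"A/G = {A / G:.6f} <= {mid:.6f} <= {top:.6f}")
```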

Proposition 2.5. Let $p \geq 1$ and $x_i > 0$, $p_i \geq 0$ $(i = 1, \ldots, n)$ with $\sum_{i=1}^n p_i = 1$.

(i) If $x_1 \leq x_2 \leq \cdots \leq x_{n-1} \leq x_n$, then we have

$$0 \leq \sum_{i=1}^n p_i x_i^p - \left(\sum_{i=1}^n p_i x_i\right)^p \leq p(x_n - x_1)\left(x_n^{p-1} - x_1^{p-1}\right) \max_{1 \leq k \leq n-1} P_k \bar{P}_{k+1} \leq \frac{p}{4}(x_n - x_1)\left(x_n^{p-1} - x_1^{p-1}\right). \tag{2.11}$$

(ii) If the set $S \subseteq \{1, \ldots, n\}$ minimizes the expression (2.4), and $0 < m \leq x_i \leq M < \infty$ $(i = 1, \ldots, n)$, then

$$0 \leq \sum_{i=1}^n p_i x_i^p - \left(\sum_{i=1}^n p_i x_i\right)^p \leq pQ(M - m)\left(M^{p-1} - m^{p-1}\right) \leq \frac{p}{4}(M - m)\left(M^{p-1} - m^{p-1}\right). \tag{2.12}$$

Remark 2.6. The above results are improvements of the corresponding inequalities obtained in [5].

Remark 2.7. Similar inequalities can be stated if we choose other convex functions such as $f(x) = x \ln x$, $x > 0$, or $f(x) = \exp(x)$, $x \in \mathbb{R}$. We omit the details.

3. A CONVERSE INEQUALITY FOR CONVEX MAPPINGS DEFINED ON $\mathbb{R}^n$

In 1996, Dragomir and Goh [15] proved the following converse of Jensen's inequality for convex mappings on $\mathbb{R}^n$.

Theorem 3.1. Let $f : \mathbb{R}^n \to \mathbb{R}$ be a differentiable convex mapping on $\mathbb{R}^n$ and let
$$(\nabla f)(x) := \left(\frac{\partial f(x)}{\partial x_1}, \ldots, \frac{\partial f(x)}{\partial x_n}\right)$$
denote the vector of the partial derivatives, $x = (x_1, \ldots, x_n) \in \mathbb{R}^n$. If $x_i \in \mathbb{R}^n$ $(i = 1, \ldots, m)$, $p_i \geq 0$ $(i = 1, \ldots, m)$ with $P_m := \sum_{i=1}^m p_i > 0$, then

$$0 \leq \frac{1}{P_m}\sum_{i=1}^m p_i f(x_i) - f\left(\frac{1}{P_m}\sum_{i=1}^m p_i x_i\right) \leq \frac{1}{P_m}\sum_{i=1}^m p_i \left\langle \nabla f(x_i), x_i \right\rangle - \left\langle \frac{1}{P_m}\sum_{i=1}^m p_i \nabla f(x_i), \frac{1}{P_m}\sum_{i=1}^m p_i x_i \right\rangle. \tag{3.1}$$

The result was applied to different problems in Information Theory by providing various counterpart inequalities for Shannon's entropy, conditional entropy, mutual information, conditional mutual information, etc.

For generalizations of (3.1) in normed spaces and other applications in Information Theory, see Matić's Ph.D. dissertation [23].

Recently, Dragomir [4] provided an upper bound for Jensen’s difference

$$\Delta(f, p, x) := \frac{1}{P_m}\sum_{i=1}^m p_i f(x_i) - f\left(\frac{1}{P_m}\sum_{i=1}^m p_i x_i\right), \tag{3.2}$$

which, even though it is not as sharp as (3.1), provides a simpler way, and for applications a better way, of estimating Jensen's difference $\Delta$. His result is embodied in the following theorem.

Theorem 3.2. Let $f : \mathbb{R}^n \to \mathbb{R}$ be a differentiable convex mapping and $x_i \in \mathbb{R}^n$, $i = 1, \ldots, m$. Suppose that there exist vectors $\phi, \Phi \in \mathbb{R}^n$ such that

$$\phi \leq x_i \leq \Phi \quad \text{(the order is considered on the co-ordinates)} \tag{3.3}$$

and $m, M \in \mathbb{R}^n$ are such that

$$m \leq \nabla f(x_i) \leq M \tag{3.4}$$

for all $i \in \{1, \ldots, m\}$. Then for all $p_i \geq 0$ $(i = 1, \ldots, m)$ with $P_m > 0$, we have the inequality

$$0 \leq \frac{1}{P_m}\sum_{i=1}^m p_i f(x_i) - f\left(\frac{1}{P_m}\sum_{i=1}^m p_i x_i\right) \leq \frac{1}{4}\|\Phi - \phi\|\,\|M - m\|, \tag{3.5}$$

where $\|\cdot\|$ is the usual Euclidean norm on $\mathbb{R}^n$.

He applied this inequality to obtain different upper bounds for Shannon's and Rényi's entropies.

In this section, we point out another counterpart for Jensen's difference, assuming that the $\nabla$-operator is of Hölder type, as follows.

Theorem 3.3. Let $f : \mathbb{R}^n \to \mathbb{R}$ be a differentiable convex mapping and $x_i \in \mathbb{R}^n$, $p_i \geq 0$ $(i = 1, \ldots, m)$ with $P_m > 0$. Suppose that the $\nabla$-operator satisfies a condition of $r$-$H$-Hölder type, i.e.,

$$\|\nabla f(x) - \nabla f(y)\| \leq H\|x - y\|^r \quad \text{for all } x, y \in \mathbb{R}^n, \tag{3.6}$$

where $H > 0$, $r \in (0, 1]$ and $\|\cdot\|$ is the Euclidean norm.

Then we have the inequality:

$$0 \leq \frac{1}{P_m}\sum_{i=1}^m p_i f(x_i) - f\left(\frac{1}{P_m}\sum_{i=1}^m p_i x_i\right) \leq \frac{H}{P_m^2} \sum_{1 \leq i < j \leq m} p_i p_j \|x_i - x_j\|^{r+1}. \tag{3.7}$$

Proof. We recall Korkine's identity,
$$\frac{1}{P_m}\sum_{i=1}^m p_i \langle y_i, x_i \rangle - \left\langle \frac{1}{P_m}\sum_{i=1}^m p_i y_i, \frac{1}{P_m}\sum_{i=1}^m p_i x_i \right\rangle = \frac{1}{2P_m^2}\sum_{i,j=1}^m p_i p_j \langle y_i - y_j, x_i - x_j \rangle, \quad x_i, y_i \in \mathbb{R}^n,$$
and simply write
$$\frac{1}{P_m}\sum_{i=1}^m p_i \langle \nabla f(x_i), x_i \rangle - \left\langle \frac{1}{P_m}\sum_{i=1}^m p_i \nabla f(x_i), \frac{1}{P_m}\sum_{i=1}^m p_i x_i \right\rangle = \frac{1}{2P_m^2}\sum_{i,j=1}^m p_i p_j \langle \nabla f(x_i) - \nabla f(x_j), x_i - x_j \rangle.$$

Using (3.1) and the properties of the modulus, we have
$$\begin{aligned} 0 &\leq \frac{1}{P_m}\sum_{i=1}^m p_i f(x_i) - f\left(\frac{1}{P_m}\sum_{i=1}^m p_i x_i\right) \\ &\leq \frac{1}{2P_m^2}\sum_{i,j=1}^m p_i p_j \left|\langle \nabla f(x_i) - \nabla f(x_j), x_i - x_j \rangle\right| \\ &\leq \frac{1}{2P_m^2}\sum_{i,j=1}^m p_i p_j \|\nabla f(x_i) - \nabla f(x_j)\|\,\|x_i - x_j\| \\ &\leq \frac{H}{2P_m^2}\sum_{i,j=1}^m p_i p_j \|x_i - x_j\|^{r+1} = \frac{H}{P_m^2}\sum_{1 \leq i < j \leq m} p_i p_j \|x_i - x_j\|^{r+1}, \end{aligned}$$
and the inequality (3.7) is proved.

Corollary 3.4. With the assumptions of Theorem 3.3 and if $\Delta := \max_{1 \leq i < j \leq m}\|x_i - x_j\|$, then we have the inequality

$$0 \leq \frac{1}{P_m}\sum_{i=1}^m p_i f(x_i) - f\left(\frac{1}{P_m}\sum_{i=1}^m p_i x_i\right) \leq \frac{H\Delta^{r+1}}{2P_m^2}\left(1 - \sum_{i=1}^m p_i^2\right). \tag{3.8}$$

Proof. Indeed, we have
$$\sum_{1 \leq i < j \leq m} p_i p_j \|x_i - x_j\|^{r+1} \leq \Delta^{r+1} \sum_{1 \leq i < j \leq m} p_i p_j.$$
However,
$$\sum_{1 \leq i < j \leq m} p_i p_j = \frac{1}{2}\left(\sum_{i,j=1}^m p_i p_j - \sum_{i=j} p_i p_j\right) = \frac{1}{2}\left(1 - \sum_{i=1}^m p_i^2\right),$$
and the inequality (3.8) is proved.

The case of Lipschitzian mappings is embodied in the following corollary.

Corollary 3.5. Let $f : \mathbb{R}^n \to \mathbb{R}$ be a differentiable convex mapping and $x_i \in \mathbb{R}^n$, $p_i \geq 0$ $(i = 1, \ldots, m)$ with $P_m > 0$. Suppose that the $\nabla$-operator is Lipschitzian with the constant $L > 0$, i.e.,

$$\|\nabla f(x) - \nabla f(y)\| \leq L\|x - y\| \quad \text{for all } x, y \in \mathbb{R}^n, \tag{3.9}$$

where $\|\cdot\|$ is the Euclidean norm. Then

$$0 \leq \frac{1}{P_m}\sum_{i=1}^m p_i f(x_i) - f\left(\frac{1}{P_m}\sum_{i=1}^m p_i x_i\right) \leq L\left[\frac{1}{P_m}\sum_{i=1}^m p_i \|x_i\|^2 - \left\|\frac{1}{P_m}\sum_{i=1}^m p_i x_i\right\|^2\right]. \tag{3.10}$$

Proof. The argument is obvious by Theorem 3.3, taking into account that for $r = 1$,
$$\sum_{1 \leq i < j \leq m} p_i p_j \|x_i - x_j\|^2 = P_m \sum_{i=1}^m p_i \|x_i\|^2 - \left\|\sum_{i=1}^m p_i x_i\right\|^2,$$
where $\|\cdot\|$ is the Euclidean norm.
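For a concrete instance of Corollary 3.5, take $f(x) = \|x\|^2$, whose gradient $\nabla f(x) = 2x$ is Lipschitzian with $L = 2$. The sketch below (ours, not the paper's) checks (3.10) for random data; for this particular $f$ the Jensen difference equals exactly half of the right-hand side, so the inequality holds with a factor of 2 to spare.

```python
import random

random.seed(4)
dim, m = 3, 7
xs = [[random.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(m)]
w = [random.random() for _ in range(m)]
Pm = sum(w)

sq_norm = lambda v: sum(t * t for t in v)   # f(x) = ||x||^2, gradient 2x, L = 2
bary = [sum(w[i] * xs[i][d] for i in range(m)) / Pm for d in range(dim)]

# Jensen difference (1/Pm) sum p_i f(x_i) - f((1/Pm) sum p_i x_i)
diff = sum(w[i] * sq_norm(xs[i]) for i in range(m)) / Pm - sq_norm(bary)

L = 2.0
bound = L * (sum(w[i] * sq_norm(xs[i]) for i in range(m)) / Pm - sq_norm(bary))
assert 0.0 <= diff <= bound + 1e-12    # here bound = 2 * diff exactly
print(f"difference = {diff:.6f}, bound (3.10) = {bound:.6f}")
```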

Moreover, if we assume more about the vectors $(x_i)_{i=\overline{1,m}}$, we can obtain a simpler result that is similar to the one in [4].

Corollary 3.6. Assume that $f$ is as in Corollary 3.5. If

$$\phi \leq x_i \leq \Phi \quad \text{(on the co-ordinates)}, \quad \phi, \Phi \in \mathbb{R}^n \quad (i = 1, \ldots, m), \tag{3.11}$$

then we have the inequality

$$0 \leq \frac{1}{P_m}\sum_{i=1}^m p_i f(x_i) - f\left(\frac{1}{P_m}\sum_{i=1}^m p_i x_i\right) \leq \frac{1}{4} \cdot L \cdot \|\Phi - \phi\|^2. \tag{3.12}$$

Proof. It follows by the fact that in $\mathbb{R}^n$ we have the following Grüss type inequality (as proved in [4]):

$$\frac{1}{P_m}\sum_{i=1}^m p_i \|x_i\|^2 - \left\|\frac{1}{P_m}\sum_{i=1}^m p_i x_i\right\|^2 \leq \frac{1}{4}\|\Phi - \phi\|^2, \tag{3.13}$$

provided that (3.11) holds.

Remark 3.7. For some Grüss type inequalities in Inner Product Spaces, see [7].

4. SOME RELATED RESULTS

We start with the following definitions from [3].

Definition 4.1. Let $-\infty < a < b < \infty$. Then $CM[a, b]$ denotes the set of all functions with domain $[a, b]$ that are continuous and strictly monotonic there.

Definition 4.2. Let $-\infty < a < b < \infty$, and let $f \in CM[a, b]$. Then, for each positive integer $n$, each $n$-tuple $x = (x_1, \ldots, x_n)$, where $a \leq x_j \leq b$ $(j = 1, 2, \ldots, n)$, and each $n$-tuple $p = (p_1, p_2, \ldots, p_n)$, where $p_j > 0$ $(j = 1, 2, \ldots, n)$ and $\sum_{j=1}^n p_j = 1$, let $M_f(x, p)$ denote the (weighted) mean
$$f^{-1}\left\{\sum_{j=1}^n p_j f(x_j)\right\}.$$

We may state now the following result.

Theorem 4.1. Let $S$ be the subset of $\{1, \ldots, n\}$ which minimizes the expression $\left|\sum_{i \in S} p_i - \frac{1}{2}\right|$. If $f, g \in CM[a, b]$, then
$$\sup_x \left\{|M_f(x, p) - M_g(x, p)|\right\} \leq Q \cdot \left\|\left(f^{-1}\right)'\right\|_\infty \cdot \left\|\left(f \circ g^{-1}\right)''\right\|_\infty \cdot |g(b) - g(a)|^2,$$
provided that the right-hand side of the inequality is finite, where, as above,
$$Q = \left(\sum_{i \in S} p_i\right)\left(1 - \sum_{i \in S} p_i\right),$$
and $\|\cdot\|_\infty$ is the usual sup-norm.

Proof. Let, as in [3], $h = f \circ g^{-1}$, $n > 1$,
$$x = (x_1, x_2, \ldots, x_n) \quad \text{and} \quad p = (p_1, p_2, \ldots, p_n)$$
be as in Definition 4.2, and $y_j = g(x_j)$ $(j = 1, 2, \ldots, n)$. By the mean-value theorem, for some $\alpha$ in the open interval joining $f(a)$ to $f(b)$, we have

$$\begin{aligned} M_f(x, p) - M_g(x, p) &= f^{-1}\left\{\sum_{j=1}^n p_j f(x_j)\right\} - f^{-1}\left[h\left\{\sum_{j=1}^n p_j g(x_j)\right\}\right] \\ &= \left(f^{-1}\right)'(\alpha)\left[\sum_{j=1}^n p_j f(x_j) - h\left\{\sum_{j=1}^n p_j g(x_j)\right\}\right] \\ &= \left(f^{-1}\right)'(\alpha)\left[\sum_{j=1}^n p_j h(y_j) - h\left\{\sum_{j=1}^n p_j y_j\right\}\right] \\ &= \left(f^{-1}\right)'(\alpha)\left[\sum_{j=1}^n p_j \left\{h(y_j) - h\left(\sum_{k=1}^n p_k y_k\right)\right\}\right]. \end{aligned}$$

Using the mean-value theorem a second time, we conclude that there exist points $z_1, z_2, \ldots, z_n$ in the open interval joining $g(a)$ to $g(b)$, such that
$$\begin{aligned} M_f(x, p) - M_g(x, p) &= \left(f^{-1}\right)'(\alpha)\big[p_1\{(1 - p_1)y_1 - p_2 y_2 - \cdots - p_n y_n\}h'(z_1) \\ &\qquad + p_2\{-p_1 y_1 + (1 - p_2)y_2 - \cdots - p_n y_n\}h'(z_2) \\ &\qquad + \cdots + p_n\{-p_1 y_1 - p_2 y_2 - \cdots + (1 - p_n)y_n\}h'(z_n)\big] \\ &= \left(f^{-1}\right)'(\alpha)\big[p_1\{p_2(y_1 - y_2) + \cdots + p_n(y_1 - y_n)\}h'(z_1) \\ &\qquad + p_2\{p_1(y_2 - y_1) + \cdots + p_n(y_2 - y_n)\}h'(z_2) \\ &\qquad + \cdots + p_n\{p_1(y_n - y_1) + \cdots + p_{n-1}(y_n - y_{n-1})\}h'(z_n)\big] \\ &= \left(f^{-1}\right)'(\alpha) \sum_{1 \leq i < j \leq n} p_i p_j (y_i - y_j)\left\{h'(z_i) - h'(z_j)\right\}. \end{aligned}$$

Using the mean-value theorem a third time, we conclude that there exist points $\omega_{ij}$ $(1 \leq i < j \leq n)$ in the open interval joining $g(a)$ to $g(b)$, such that
$$\left(f^{-1}\right)'(\alpha) \sum_{1 \leq i < j \leq n} p_i p_j (y_i - y_j)\left\{h'(z_i) - h'(z_j)\right\} = \left(f^{-1}\right)'(\alpha) \sum_{1 \leq i < j \leq n} p_i p_j (y_i - y_j)(z_i - z_j) h''(\omega_{ij}).$$

Consequently,
$$\begin{aligned} |M_f(x, p) - M_g(x, p)| &\leq \left|\left(f^{-1}\right)'(\alpha)\right| \sum_{1 \leq i < j \leq n} p_i p_j |y_i - y_j| \cdot |z_i - z_j| \cdot |h''(\omega_{ij})| \\ &\leq \left\|\left(f^{-1}\right)'\right\|_\infty \cdot \|h''\|_\infty \cdot \sum_{1 \leq i < j \leq n} p_i p_j |y_i - y_j| \cdot |z_i - z_j| \\ &\leq \left\|\left(f^{-1}\right)'\right\|_\infty \cdot \left\|\left(f \circ g^{-1}\right)''\right\|_\infty \sqrt{\sum_{1 \leq i < j \leq n} p_i p_j |y_i - y_j|^2} \cdot \sqrt{\sum_{1 \leq i < j \leq n} p_i p_j |z_i - z_j|^2} \quad \text{(by the Cauchy-Bunyakovsky-Schwarz inequality)} \\ &\leq \left\|\left(f^{-1}\right)'\right\|_\infty \cdot \left\|\left(f \circ g^{-1}\right)''\right\|_\infty \sqrt{\left(\sum_{i \in S} p_i\right)\left(1 - \sum_{i \in S} p_i\right)|g(b) - g(a)|^2} \cdot \sqrt{\left(\sum_{i \in S} p_i\right)\left(1 - \sum_{i \in S} p_i\right)|g(b) - g(a)|^2} \quad \text{(by the Andrica-Badea result)} \\ &= Q \left\|\left(f^{-1}\right)'\right\|_\infty \cdot \left\|\left(f \circ g^{-1}\right)''\right\|_\infty \cdot |g(b) - g(a)|^2, \end{aligned}$$
and the theorem is proved.

Corollary 4.2. If $f, g \in CM[a, b]$, then
$$\sup_x \left\{|M_f(x, p) - M_g(x, p)|\right\} \leq Q \cdot \left\|\frac{1}{f'}\right\|_\infty \cdot \left\|\frac{1}{g'}\left(\frac{f'}{g'}\right)'\right\|_\infty \cdot |g(b) - g(a)|^2,$$
provided that the right-hand side of the inequality exists.

Proof. This follows at once from the fact that
$$\left(f^{-1}\right)' = \frac{1}{f' \circ f^{-1}}$$
and
$$\left(f \circ g^{-1}\right)'' = \frac{\left(g' \circ g^{-1}\right)\left(f'' \circ g^{-1}\right) - \left(f' \circ g^{-1}\right)\left(g'' \circ g^{-1}\right)}{\left(g' \circ g^{-1}\right)^3} = \left[\frac{1}{g'}\left(\frac{f'}{g'}\right)'\right] \circ g^{-1}.$$

Remark 4.3. This establishes Theorem 4.3 from [3] and replaces the multiplicative factor $\frac{1}{4}$ by $Q$. In Corollary 4.2, we also replaced the multiplicative factor $\frac{1}{4}$ by $Q$.

5. APPLICATIONS IN INFORMATION THEORY

We give some new applications for Shannon's entropy
$$H_b(X) := \sum_{i=1}^r p_i \log_b \frac{1}{p_i},$$
where $X$ is a random variable with the probability distribution $(p_i)_{i=\overline{1,r}}$.

Theorem 5.1. Let $X$ be as above and assume that $p_1 \geq p_2 \geq \cdots \geq p_r$ or $p_1 \leq p_2 \leq \cdots \leq p_r$. Then we have the inequality

$$0 \leq \log_b r - H_b(X) \leq \frac{(p_1 - p_r)^2}{p_1 p_r \ln b} \max_{1 \leq k \leq r-1} P_k \bar{P}_{k+1}. \tag{5.1}$$

Proof. We choose in Theorem 2.1, $f(x) = -\log_b x$, $x > 0$, and $x_i = \frac{1}{p_i}$ $(i = 1, \ldots, r)$. If $p_1 \geq p_2 \geq \cdots \geq p_r$, then $x_1 \leq x_2 \leq \cdots \leq x_r$, and by (2.1) we obtain
$$0 \leq \log_b r - H_b(X) \leq \frac{1}{\ln b}\left(\frac{1}{p_r} - \frac{1}{p_1}\right)(p_1 - p_r) \max_{1 \leq k \leq r-1} P_k \bar{P}_{k+1},$$
which is equivalent to (5.1). The same inequality is obtained if $p_1 \leq p_2 \leq \cdots \leq p_r$.

Theorem 5.2. Let $X$ be as above and suppose that
$$p_M := \max\{p_i \mid i = 1, \ldots, r\}, \qquad p_m := \min\{p_i \mid i = 1, \ldots, r\}.$$
If $S$ is a subset of the set $\{1, \ldots, r\}$ minimizing the expression $\left|\sum_{i \in S} p_i - \frac{1}{2}\right|$, then we have the estimate

$$0 \leq \log_b r - H_b(X) \leq Q \cdot \frac{(p_M - p_m)^2}{\ln b \cdot p_M p_m}. \tag{5.2}$$

Proof. We shall choose in Theorem 2.3,
$$f(x) = -\log_b x, \quad x > 0, \quad x_i = \frac{1}{p_i} \quad \left(i = \overline{1, r}\right).$$
Then $m = \frac{1}{p_M}$, $M = \frac{1}{p_m}$, $f'(x) = -\frac{1}{x \ln b}$, and the inequality (2.5) becomes
$$0 \leq \log_b r - \sum_{i=1}^r p_i \log_b \frac{1}{p_i} \leq Q \cdot \frac{1}{\ln b}\left(\frac{1}{p_m} - \frac{1}{p_M}\right)(p_M - p_m) = Q \cdot \frac{1}{\ln b} \cdot \frac{(p_M - p_m)^2}{p_M p_m},$$
hence the estimate (5.2) is proved.
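The entropy bound (5.2) is easy to test numerically; the following sketch (our illustration, with a randomly generated distribution) brute-forces the minimizing subset $S$ and checks (5.2) for base $b = 2$.

```python
import math
import random
from itertools import combinations

random.seed(5)
r = 8
w = [random.random() + 0.1 for _ in range(r)]
p = [wi / sum(w) for wi in w]

b = 2.0
H = sum(pi * math.log(1.0 / pi, b) for pi in p)    # Shannon entropy, base b

# brute-force the subset S minimizing |sum_{i in S} p_i - 1/2|
best = min(
    (abs(sum(p[i] for i in S) - 0.5), S)
    for k in range(r + 1)
    for S in combinations(range(r), k)
)
PS = sum(p[i] for i in best[1])
Q = PS * (1.0 - PS)

pM, pm = max(p), min(p)
rhs = Q * (pM - pm) ** 2 / (math.log(b) * pM * pm)   # right-hand side of (5.2)
assert 0.0 <= math.log(r, b) - H <= rhs + 1e-12
print(f"log_b(r) - H_b(X) = {math.log(r, b) - H:.6f} <= {rhs:.6f}")
```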

Consider the Shannon entropy

$$H(X) := H_e(X) = \sum_{i=1}^r p_i \ln \frac{1}{p_i} \tag{5.3}$$

and Rényi's entropy of order $\alpha$ $\left(\alpha \in (0, \infty) \setminus \{1\}\right)$:

$$H^{[\alpha]}(X) := \frac{1}{1 - \alpha} \ln\left(\sum_{i=1}^r p_i^\alpha\right). \tag{5.4}$$

Using the classical Jensen's discrete inequality for convex mappings, i.e.,

$$f\left(\sum_{i=1}^r p_i x_i\right) \leq \sum_{i=1}^r p_i f(x_i), \tag{5.5}$$

where $f : I \subseteq \mathbb{R} \to \mathbb{R}$ is a convex mapping on $I$, $x_i \in I$ $(i = 1, \ldots, r)$ and $(p_i)_{i=\overline{1,r}}$ is a probability distribution, for the convex mapping $f(x) = -\ln x$ we have

$$\ln\left(\sum_{i=1}^r p_i x_i\right) \geq \sum_{i=1}^r p_i \ln x_i. \tag{5.6}$$

Choose $x_i = p_i^{\alpha - 1}$ $(i = 1, \ldots, r)$ in (5.6) to obtain
$$\ln\left(\sum_{i=1}^r p_i^\alpha\right) \geq (\alpha - 1)\sum_{i=1}^r p_i \ln p_i,$$
which is equivalent to
$$(1 - \alpha)\left(H^{[\alpha]}(X) - H(X)\right) \geq 0.$$

Now, if $\alpha \in (0, 1)$, then $H^{[\alpha]}(X) \geq H(X)$, and if $\alpha > 1$, then $H^{[\alpha]}(X) \leq H(X)$. Equality holds iff $(p_i)_{i=\overline{1,r}}$ is a uniform distribution, and this fact follows by the strict convexity of $-\ln(\cdot)$. This inequality also follows as a special case of the following well known fact: $H^{[\alpha]}(X)$ is a nonincreasing function of $\alpha$. See for example [26] or [22].

Theorem 5.3. Under the above assumptions, given that $p_m = \min_{i=\overline{1,r}} p_i$ and $p_M = \max_{i=\overline{1,r}} p_i$, we have the inequality

$$0 \leq (1 - \alpha)\left(H^{[\alpha]}(X) - H(X)\right) \leq Q \cdot \frac{\left(p_M^{\alpha-1} - p_m^{\alpha-1}\right)^2}{p_M^{\alpha-1} p_m^{\alpha-1}}, \tag{5.7}$$

for all $\alpha \in (0, 1) \cup (1, \infty)$.

Proof. If $\alpha \in (0, 1)$, then
$$x_i := p_i^{\alpha - 1} \in \left[p_M^{\alpha-1},\, p_m^{\alpha-1}\right],$$
and if $\alpha \in (1, \infty)$, then
$$x_i = p_i^{\alpha - 1} \in \left[p_m^{\alpha-1},\, p_M^{\alpha-1}\right],$$
for $i \in \{1, \ldots, r\}$.

Applying Theorem 2.3 for $x_i := p_i^{\alpha-1}$ and $f(x) = -\ln x$, and taking into account that $f'(x) = -\frac{1}{x}$, we obtain
$$(1 - \alpha)\left(H^{[\alpha]}(X) - H(X)\right) \leq \begin{cases} Q\left(p_m^{\alpha-1} - p_M^{\alpha-1}\right)\left(\dfrac{1}{p_M^{\alpha-1}} - \dfrac{1}{p_m^{\alpha-1}}\right) & \text{if } \alpha \in (0, 1), \\[3mm] Q\left(p_M^{\alpha-1} - p_m^{\alpha-1}\right)\left(\dfrac{1}{p_m^{\alpha-1}} - \dfrac{1}{p_M^{\alpha-1}}\right) & \text{if } \alpha \in (1, \infty) \end{cases}$$
$$= \begin{cases} Q \cdot \dfrac{\left(p_m^{\alpha-1} - p_M^{\alpha-1}\right)^2}{p_m^{\alpha-1} p_M^{\alpha-1}} & \text{if } \alpha \in (0, 1), \\[3mm] Q \cdot \dfrac{\left(p_M^{\alpha-1} - p_m^{\alpha-1}\right)^2}{p_M^{\alpha-1} p_m^{\alpha-1}} & \text{if } \alpha \in (1, \infty) \end{cases} = Q \cdot \frac{\left(p_M^{\alpha-1} - p_m^{\alpha-1}\right)^2}{p_M^{\alpha-1} p_m^{\alpha-1}}$$
for all $\alpha \in (0, 1) \cup (1, \infty)$, and the theorem is proved.

Using a similar argument to the one in Theorem 5.3, we can state the following direct application of Theorem 2.3.

Theorem 5.4. Let $(p_i)_{i=\overline{1,r}}$ be as in Theorem 5.3. Then we have the inequality

$$0 \leq (1 - \alpha)H^{[\alpha]}(X) - \ln r - \alpha \ln G_r(p) \leq Q \cdot \frac{\left(p_M^{\alpha-1} - p_m^{\alpha-1}\right)^2}{p_M^{\alpha-1} p_m^{\alpha-1}}, \tag{5.8}$$

for all $\alpha \in (0, 1) \cup (1, \infty)$.

Remark 5.5. The above results improve the corresponding results from [5] and [4] with the constant $Q$, which is less than $\frac{1}{4}$.

Acknowledgement 1. The authors would like to thank the anonymous referee for valuable comments and for the references [26] and [22].


REFERENCES

[1] A. RÉNYI, On measures of entropy and information, Proc. Fourth Berkeley Symp. Math. Statist. Prob., 1 (1961), 547–561, Univ. of California Press, Berkeley.

[2] D. ANDRICA AND C. BADEA, Grüss' inequality for positive linear functionals, Periodica Math. Hung., 19(2) (1988), 155–167.

[3] G.T. CARGO AND O. SHISHA, A metric space connected with general means, J. Approx. Th., 2 (1969), 207–222.

[4] S.S. DRAGOMIR, A converse of the Jensen inequality for convex mappings of several variables and applications. (Electronic Preprint: http://matilda.vu.edu.au/~rgmia/InfTheory/ConverseJensen.dvi)

[5] S.S. DRAGOMIR, A converse result for Jensen's discrete inequality via Grüss' inequality and applications in information theory, Analele Univ. Oradea, Fasc. Math., 7 (1999-2000), 178–189.

[6] S.S. DRAGOMIR, A further improvement of Jensen's inequality, Tamkang J. Math., 25(1) (1994), 29–36.

[7] S.S. DRAGOMIR, A generalisation of Grüss's inequality in inner product spaces and applications, J. Math. Anal. Appl., 237 (1999), 74–82.

[8] S.S. DRAGOMIR, A new improvement of Jensen's inequality, Indian J. Pure Appl. Math., 26(10) (1995), 959–968.

[9] S.S. DRAGOMIR, An improvement of Jensen's inequality, Bull. Math. Soc. Sci. Math. Romania, 34 (1990), 291–296.

[10] S.S. DRAGOMIR, On some refinements of Jensen's inequality and applications, Utilitas Math., 43 (1993), 235–243.

[11] S.S. DRAGOMIR, Some refinements of Jensen's inequality, J. Math. Anal. Appl., 168 (1992), 518–522.

[12] S.S. DRAGOMIR, Some refinements of Ky Fan's inequality, J. Math. Anal. Appl., 163 (1992), 317–321.

[13] S.S. DRAGOMIR, Two mappings in connection with Jensen's inequality, Extracta Math., 8(2-3) (1993), 102–105.

[14] S.S. DRAGOMIR AND S. FITZPATRICK, s-Orlicz convex functions in linear spaces and Jensen's discrete inequality, J. Math. Anal. Appl., 210 (1997), 419–439.

[15] S.S. DRAGOMIR AND C.J. GOH, A counterpart of Jensen's discrete inequality for differentiable convex mappings and applications in Information Theory, Math. Comput. Modelling, 24(2) (1996), 1–11.

[16] S.S. DRAGOMIR AND C.J. GOH, Some bounds on entropy measures in information theory, Appl. Math. Lett., 10(3) (1997), 23–28.

[17] S.S. DRAGOMIR AND C.J. GOH, Some counterpart inequalities for a functional associated with Jensen's inequality, J. Inequal. Appl., 1 (1997), 311–325.

[18] S.S. DRAGOMIR AND N.M. IONESCU, Some converses of Jensen's inequality and applications, Anal. Num. Theor. Approx. (Cluj-Napoca), 23 (1994), 71–78.

[19] S.S. DRAGOMIR, C.E.M. PEARCE AND J.E. PEČARIĆ, New inequalities for logarithmic map and their applications for entropy and mutual information, Kyungpook Math. J., in press.

[20] S.S. DRAGOMIR, C.E.M. PEARCE AND J.E. PEČARIĆ, On Jensen's and related inequalities for isotonic sublinear functionals, Acta Sci. Math. (Szeged), 61 (1995), 373–382.

[21] S.S. DRAGOMIR, J.E. PEČARIĆ AND L.E. PERSSON, Properties of some functionals related to Jensen's inequality, Acta Math. Hungarica, 20(1-2) (1998), 129–143.

[22] T. KOSKI AND L.E. PERSSON, Some properties of generalized entropy with applications to compression of data, J. of Int. Sciences, 62 (1992), 103–132.

[23] M. MATIĆ, Jensen's Inequality and Applications in Information Theory, (in Croatian), Ph.D. Dissertation, Split, Croatia, 1998.

[24] R.J. McELIECE, The Theory of Information and Coding, Addison Wesley Publishing Company, Reading, 1977.

[25] J.E. PEČARIĆ, F. PROSCHAN AND Y.L. TONG, Convex Functions, Partial Orderings and Statistical Applications, Academic Press, 1992.

[26] J. PEETRE AND L.E. PERSSON, A general Beckenbach's inequality with applications, In: Function Spaces, Differential Operators and Nonlinear Analysis, Pitman Research Notes in Math., 211 (1989), 125–139.
