CODING THEOREMS ON GENERALIZED COST MEASURE

RAYEES AHMAD DAR AND M.A.K. BAIG
DEPARTMENT OF STATISTICS
UNIVERSITY OF KASHMIR
SRINAGAR-190006, INDIA
rayees_stats@yahoo.com

Received 6 June, 2007; accepted 13 December, 2007
Communicated by N.S. Barnett

ABSTRACT. In the present communication, a generalized cost measure in terms of the utilities and lengths of output codewords from a memoryless source is defined. Lower and upper bounds in terms of a non-additive generalized measure of 'useful' information are obtained.

Key words and phrases: Decipherable code, Source alphabet, Codeword, Cost function, Hölder's inequality, Codeword length.

2000 Mathematics Subject Classification. 94A17, 94A24.

1. INTRODUCTION

Let a finite set of n source symbols X = (x_1, x_2, ..., x_n) with probabilities P = (p_1, p_2, ..., p_n) be encoded using D (D ≥ 2) code alphabets. Then there is a uniquely decipherable/instantaneous code with lengths l_1, l_2, ..., l_n if and only if

(1.1)    \sum_{i=1}^{n} D^{-l_i} \le 1.

(1.1) is known as the Kraft inequality [5]. If L = \sum_{i=1}^{n} l_i p_i is the average codeword length, then for a code which satisfies (1.1), Shannon's coding theorem for a noiseless channel (Feinstein [5]) gives a lower bound on L in terms of Shannon's entropy [13],

(1.2)    L \ge H(P),

with equality iff l_i = -\log p_i for all i = 1, 2, ..., n. All logarithms are to base D.
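For illustration (not part of the original paper), the following minimal Python sketch checks (1.1) and (1.2) on a hypothetical dyadic source; the distribution and the Shannon lengths l_i = ⌈-log_D p_i⌉ are our own choices:

```python
import math

# Hypothetical 4-symbol source, binary code alphabet (D = 2); logs are base D.
D = 2
p = [0.5, 0.25, 0.125, 0.125]

# Shannon code lengths l_i = ceil(-log_D p_i) always satisfy (1.1).
l = [math.ceil(-math.log2(pi)) for pi in p]

kraft_sum = sum(D ** (-li) for li in l)
L = sum(pi * li for pi, li in zip(p, l))      # average codeword length
H = -sum(pi * math.log2(pi) for pi in p)      # Shannon entropy H(P)

print(f"Kraft sum = {kraft_sum}, L = {L}, H(P) = {H}")
assert kraft_sum <= 1.0                       # (1.1)
assert L >= H - 1e-12                         # (1.2); equality here since p is dyadic
```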

Belis and Guiasu [2] observed that a source is not completely specified by the probability distribution P over the source symbols X in the absence of its qualitative character. It can also be assumed that the source letters or symbols are assigned weights according to their importance or utility in the view of the experimenter.



Let U = (u_1, u_2, ..., u_n) be a set of positive real numbers, where u_i is the utility or importance of outcome x_i. The utility is, in general, independent of p_i, the probability of encoding source symbol x_i. The information source is thus given by

(1.3)    S = \begin{pmatrix} x_1 & x_2 & \cdots & x_n \\ p_1 & p_2 & \cdots & p_n \\ u_1 & u_2 & \cdots & u_n \end{pmatrix},

where u_i > 0, p_i ≥ 0 and \sum_{i=1}^{n} p_i = 1.

Belis and Guiasu [2] introduced the following qualitative-quantitative measure of information,

(1.4)    H(P;U) = -\sum_{i=1}^{n} u_i p_i \log p_i,

which is a measure of the average quantity of 'valuable' or 'useful' information provided by the information source (1.3). Guiasu and Picard [6] considered the problem of encoding the letter output of the source (1.3) by means of a single-letter prefix code whose codewords w_1, w_2, ..., w_n have lengths l_1, l_2, ..., l_n respectively, satisfying the Kraft inequality (1.1). They introduced the following 'useful' mean length of the code:

(1.5)    L(P;U) = \frac{\sum_{i=1}^{n} u_i p_i l_i}{\sum_{i=1}^{n} u_i p_i}.

Longo [11] interpreted (1.5) as the average transmission cost of the letter x_i and obtained the following lower bound for the cost measure (1.5):

(1.6)    L(P;U) \ge H(P;U),

where

H(P;U) = -\frac{\sum_{i=1}^{n} u_i p_i \log p_i}{\sum_{i=1}^{n} u_i p_i}

is the 'useful' information measure due to Guiasu and Picard [6], which was also characterized by Bhaker and Hooda [3] by a mean value representation.
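As a numerical sketch (our illustration, with hypothetical utilities and probabilities), the following code computes the Guiasu-Picard measure H(P;U) and the 'useful' mean length (1.5) for integer Shannon lengths, and checks Longo's bound (1.6):

```python
import math

# Hypothetical source (1.3): probabilities p_i with utilities u_i; D = 2.
D = 2
p = [0.4, 0.3, 0.2, 0.1]
u = [4.0, 1.0, 2.0, 0.5]
l = [math.ceil(-math.log2(pi)) for pi in p]   # integer lengths satisfying (1.1)

w = sum(ui * pi for ui, pi in zip(u, p))

# Guiasu-Picard 'useful' information measure and the 'useful' mean length (1.5).
H_PU = -sum(ui * pi * math.log2(pi) for ui, pi in zip(u, p)) / w
L_PU = sum(ui * pi * li for ui, pi, li in zip(u, p, l)) / w

print(f"L(P;U) = {L_PU:.4f} >= H(P;U) = {H_PU:.4f}")
assert L_PU >= H_PU - 1e-12                   # Longo's bound (1.6)
```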

2. GENERALIZED MEASURES OF COST

In the derivation of the cost measure (1.5) it is assumed that the cost is a linear function of code length, but this is not always the case. There are occasions when the cost behaves like an exponential function of the codeword lengths; such situations occur frequently in market equilibrium and growth models in economics. Thus it may sometimes be more appropriate to choose a code which minimizes the monotonic function

(2.1)    C = \sum_{i=1}^{n} u_i^{\beta} p_i^{\beta} D^{\frac{1-\alpha}{\alpha} l_i},

where α > 0 (≠ 1) and β > 0 are parameters related to the cost.

In order to make the results of the paper more comparable with the usual noiseless coding theorem, instead of minimizing (2.1) we minimize

(2.2)    L_{\alpha}^{\beta}(U) = \frac{1}{2^{1-\alpha}-1} \left[ \left( \frac{\sum_{i=1}^{n} (u_i p_i)^{\beta} D^{\frac{1-\alpha}{\alpha} l_i}}{\sum_{i=1}^{n} (u_i p_i)^{\beta}} \right)^{\alpha} - 1 \right],

where α > 0 (≠ 1), β > 0, which is a monotonic function of C. We define (2.2) as the 'useful' average code length of order α and type β.

Clearly, if α → 1 and β = 1, (2.2) reduces to (1.5), which further reduces to the ordinary mean length given by Shannon [13] when u_i = 1 for all i = 1, 2, ..., n. We also note that (2.2) is a monotonic non-decreasing function of α, and if all the l_i are the same, say l_i = l for each i, then L_{\alpha}^{\beta}(U) = l as α → 1. This is an important property for any measure of length to possess.

In the next section, we derive the lower and upper bounds of the cost function (2.2) in terms of the following 'useful' information measure of order α and type β:

(2.3)    H_{\alpha}^{\beta}(P;U) = \frac{1}{2^{1-\alpha}-1} \left[ \frac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} - 1 \right],

where α > 0 (≠ 1), β > 0, p_i ≥ 0 for i = 1, 2, ..., n, and \sum_{i=1}^{n} p_i \le 1.

(i) When β = 1, (2.3) reduces to the measure of 'useful' information proposed and characterized by Hooda and Ram [9].

(ii) If α → 1 and β = 1, (2.3) reduces to the measure given by Belis and Guiasu [2].

(iii) If α → 1, β = 1 and u_i = 1 for all i = 1, 2, ..., n, (2.3) reduces to the well-known measure given by Shannon [13].

Also, we have used the condition

(2.4)    \sum_{i=1}^{n} u_i^{\beta} p_i^{\beta-1} D^{-l_i} \le \sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}

to find the bounds. It may be seen that in the case β = 1, u_i = 1 for all i = 1, 2, ..., n, (2.4) reduces to the Kraft inequality (1.1). Here D (D ≥ 2) is the size of the code alphabet.
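For concreteness, here is a minimal Python sketch of (2.2), (2.3) and condition (2.4); the function names (useful_info, useful_length, satisfies_condition) and the data are ours, not the paper's, and we take D = 2 so that the α → 1, β = 1 reduction of (2.2) to (1.5) can be observed numerically:

```python
import math

def useful_info(p, u, alpha, beta):
    """The 'useful' information measure H_alpha^beta(P;U) of (2.3)."""
    num = sum(ui**beta * pi**(alpha + beta - 1) for ui, pi in zip(u, p))
    den = sum(ui**beta * pi**beta for ui, pi in zip(u, p))
    return (num / den - 1) / (2**(1 - alpha) - 1)

def useful_length(p, u, l, alpha, beta, D=2):
    """The 'useful' average code length L_alpha^beta(U) of (2.2)."""
    num = sum((ui * pi)**beta * D**((1 - alpha) / alpha * li)
              for ui, pi, li in zip(u, p, l))
    den = sum((ui * pi)**beta for ui, pi in zip(u, p))
    return ((num / den)**alpha - 1) / (2**(1 - alpha) - 1)

def satisfies_condition(p, u, l, beta, D=2):
    """The generalized Kraft-type condition (2.4)."""
    lhs = sum(ui**beta * pi**(beta - 1) * D**(-li) for ui, pi, li in zip(u, p, l))
    rhs = sum(ui**beta * pi**beta for ui, pi in zip(u, p))
    return lhs <= rhs + 1e-12

p = [0.4, 0.3, 0.2, 0.1]
u = [4.0, 1.0, 2.0, 0.5]
l = [2, 2, 3, 4]                      # integer lengths from the earlier sketch
assert satisfies_condition(p, u, l, beta=1.0)

# As alpha -> 1 with beta = 1 and D = 2, (2.2) approaches the mean length (1.5).
w = sum(ui * pi for ui, pi in zip(u, p))
mean_15 = sum(ui * pi * li for ui, pi, li in zip(u, p, l)) / w
print(useful_length(p, u, l, alpha=0.999, beta=1.0), mean_15)
```

With these (hypothetical) values the two printed numbers agree to about three decimal places, as the stated reduction suggests.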

Longo [12], Gurdial and Pessoa [7], Autar and Khan [1], Jain and Tuteja [10], Taneja et al. [16], Bhatia [4], Singh, Kumar and Tuteja [15] and Hooda and Bhaker [8] have considered the problem of 'useful' information measures in the context of noiseless coding theorems for sources involving utilities.

In this paper, we study lower and upper bounds by considering a new function dependent on the parameter α, the type β and a utility function. Our motivation for studying this new function is that it generalizes some entropy functions already existing in the literature. The function under study is closely related to Tsallis entropy, which is used in physics.

3. BOUNDS ON THE GENERALIZED COST MEASURES

Theorem 3.1. For all integers D (D ≥ 2), let l_i satisfy (2.4). Then the generalized average 'useful' codeword length satisfies

(3.1)    L_{\alpha}^{\beta}(U) \ge H_{\alpha}^{\beta}(P;U),

and the equality holds iff

(3.2)    l_i = -\log p_i^{\alpha} + \log \frac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}}.

Proof. By Hölder's inequality [14],

(3.3)    \sum_{i=1}^{n} x_i y_i \ge \left( \sum_{i=1}^{n} x_i^{p} \right)^{\frac{1}{p}} \left( \sum_{i=1}^{n} y_i^{q} \right)^{\frac{1}{q}}

for all x_i, y_i > 0, i = 1, 2, ..., n, where \frac{1}{p} + \frac{1}{q} = 1 with p < 1 (≠ 0), q < 0, or q < 1 (≠ 0), p < 0. We see that equality holds if and only if there exists a positive constant c such that

(3.4)    x_i^{p} = c\, y_i^{q}.


Making the substitutions

p = \frac{\alpha-1}{\alpha}, \qquad q = 1-\alpha,

x_i = \frac{(u_i p_i)^{\frac{\alpha\beta}{\alpha-1}} D^{-l_i}}{\sum_{i=1}^{n} (u_i p_i)^{\frac{\alpha\beta}{\alpha-1}}}, \qquad y_i = \frac{u_i^{\frac{\beta}{1-\alpha}} p_i^{\frac{\alpha+\beta-1}{1-\alpha}}}{\sum_{i=1}^{n} (u_i p_i)^{\frac{\beta}{1-\alpha}}}

in (3.3), we get

\frac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta-1} D^{-l_i}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} \ge \left[ \frac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta} D^{\frac{1-\alpha}{\alpha} l_i}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} \right]^{\frac{\alpha}{\alpha-1}} \left[ \frac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} \right]^{\frac{1}{1-\alpha}}.

Using the condition (2.4), we get

"

Pn

i=1uβipβiD1−αα li Pn

i=1uβipβi

#1−αα

"

Pn

i=1uβipα+β−1i Pn

i=1uβipβi

#1−α1 . Taking0< α <1and raising both sides to the power(1−α),

"

Pn

i=1uβipβiD1−αα li Pn

i=1uβipβi

#α

"

Pn

i=1uβipα+β−1i Pn

i=1uβipβi

# .

Multiplying both sides by 21−α1−1 >0 for0< α <1and simplifying, we obtain Lβα(U)≥Hαβ(P;U).

Forα >1,the proof follows along similar lines.
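A numerical spot-check of Theorem 3.1 (our sketch, with hypothetical data, not part of the original proof): with the generally non-integer lengths (3.2), condition (2.4) holds and L_α^β(U) coincides with H_α^β(P;U) up to rounding error:

```python
import math

# Spot-check of Theorem 3.1 on hypothetical data (D = 2, alpha = 0.5, beta = 1.5).
D, alpha, beta = 2, 0.5, 1.5
p = [0.4, 0.3, 0.2, 0.1]
u = [4.0, 1.0, 2.0, 0.5]

den = sum(ui**beta * pi**beta for ui, pi in zip(u, p))
S = sum(ui**beta * pi**(alpha + beta - 1) for ui, pi in zip(u, p)) / den

# The (generally non-integer) optimal lengths (3.2): l_i = -log p_i^alpha + log S.
l = [-alpha * math.log(pi, D) + math.log(S, D) for pi in p]

# Condition (2.4) holds (here with equality) ...
assert sum(ui**beta * pi**(beta - 1) * D**(-li)
           for ui, pi, li in zip(u, p, l)) <= den + 1e-9

# ... and L_alpha^beta(U) = H_alpha^beta(P;U), i.e. equality in (3.1).
ratio = sum((ui * pi)**beta * D**((1 - alpha) / alpha * li)
            for ui, pi, li in zip(u, p, l)) / den
L = (ratio**alpha - 1) / (2**(1 - alpha) - 1)
H = (S - 1) / (2**(1 - alpha) - 1)
print(L, H)
assert abs(L - H) < 1e-9
```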

Theorem 3.2. For every code with lengths {l_i}, i = 1, 2, ..., n, as in Theorem 3.1, L_{\alpha}^{\beta}(U) can be made to satisfy the inequality

(3.5)    L_{\alpha}^{\beta}(U) < H_{\alpha}^{\beta}(P;U)\, D^{1-\alpha} + \frac{D^{1-\alpha}-1}{2^{1-\alpha}-1}.

Proof. Let l_i be the positive integer satisfying the inequality

(3.6)    -\log p_i^{\alpha} + \log \frac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} \le l_i < -\log p_i^{\alpha} + \log \frac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} + 1.

Consider the interval

(3.7)    \delta_i = \left[ -\log p_i^{\alpha} + \log \frac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}},\; -\log p_i^{\alpha} + \log \frac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} + 1 \right]

of length 1. In every δ_i there lies exactly one positive integer l_i such that

(3.8)    0 < -\log p_i^{\alpha} + \log \frac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} \le l_i < -\log p_i^{\alpha} + \log \frac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} + 1.

We will first show that the sequence l_1, l_2, ..., l_n thus defined satisfies (2.4). From (3.8), we have

-\log p_i^{\alpha} + \log \frac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} \le l_i,

which gives

p_i^{\alpha} \left( \frac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} \right)^{-1} \ge D^{-l_i}.


Multiplying both sides by u_i^{\beta} p_i^{\beta-1} and summing over i = 1, 2, ..., n, we get (2.4). The last inequality of (3.8) gives

l_i < -\log p_i^{\alpha} + \log \frac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} + 1,

or

D^{l_i} < p_i^{-\alpha} \left( \frac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} \right) D.

For0< α <1,raising both sides to the power 1−αα we obtain,

D1−αα li <

pαi

Pn

i=1uβipα+β−1i Pn

i=1uβipβi

α−1 α

D1−αα .

Multiplying both sides by \frac{u_i^{\beta} p_i^{\beta}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} and summing over i = 1, 2, ..., n gives

\frac{\sum_{i=1}^{n} (u_i p_i)^{\beta} D^{\frac{1-\alpha}{\alpha} l_i}}{\sum_{i=1}^{n} (u_i p_i)^{\beta}} < \left( \frac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} (u_i p_i)^{\beta}} \right)^{\frac{1}{\alpha}} D^{\frac{1-\alpha}{\alpha}},

and hence

\left( \frac{\sum_{i=1}^{n} (u_i p_i)^{\beta} D^{\frac{1-\alpha}{\alpha} l_i}}{\sum_{i=1}^{n} (u_i p_i)^{\beta}} \right)^{\alpha} < \left( \frac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} (u_i p_i)^{\beta}} \right) D^{1-\alpha}.

Since 2^{1-\alpha} - 1 > 0 for 0 < α < 1, after suitable operations we obtain

\frac{1}{2^{1-\alpha}-1} \left[ \left( \frac{\sum_{i=1}^{n} (u_i p_i)^{\beta} D^{\frac{1-\alpha}{\alpha} l_i}}{\sum_{i=1}^{n} (u_i p_i)^{\beta}} \right)^{\alpha} - 1 \right] < \frac{1}{2^{1-\alpha}-1} \left[ \frac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} (u_i p_i)^{\beta}} - 1 \right] D^{1-\alpha} + \frac{D^{1-\alpha}-1}{2^{1-\alpha}-1},

that is,

L_{\alpha}^{\beta}(U) < H_{\alpha}^{\beta}(P;U)\, D^{1-\alpha} + \frac{D^{1-\alpha}-1}{2^{1-\alpha}-1}.

As D ≥ 2, we have \frac{D^{1-\alpha}-1}{2^{1-\alpha}-1} \ge 1, from which it follows that the upper bound on L_{\alpha}^{\beta}(U) in (3.5) is greater than unity.

For α > 1, the proof again follows along similar lines. □
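Similarly, a spot-check of Theorem 3.2 (again our sketch with hypothetical data): choosing the unique integer length in each interval (3.7) satisfies (2.4) and keeps L_α^β(U) strictly below the bound (3.5):

```python
import math

# Spot-check of Theorem 3.2 on the same hypothetical data as above.
D, alpha, beta = 2, 0.5, 1.5
p = [0.4, 0.3, 0.2, 0.1]
u = [4.0, 1.0, 2.0, 0.5]

den = sum(ui**beta * pi**beta for ui, pi in zip(u, p))
S = sum(ui**beta * pi**(alpha + beta - 1) for ui, pi in zip(u, p)) / den

# The unique positive integer l_i in each unit interval delta_i of (3.7).
l = [math.ceil(-alpha * math.log(pi, D) + math.log(S, D)) for pi in p]

# These integer lengths still satisfy (2.4), as shown in the proof ...
assert sum(ui**beta * pi**(beta - 1) * D**(-li)
           for ui, pi, li in zip(u, p, l)) <= den + 1e-9

# ... and L_alpha^beta(U) stays strictly below the bound (3.5).
ratio = sum((ui * pi)**beta * D**((1 - alpha) / alpha * li)
            for ui, pi, li in zip(u, p, l)) / den
L = (ratio**alpha - 1) / (2**(1 - alpha) - 1)
H = (S - 1) / (2**(1 - alpha) - 1)
upper = H * D**(1 - alpha) + (D**(1 - alpha) - 1) / (2**(1 - alpha) - 1)
print(L, upper)
assert L < upper
```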

REFERENCES

[1] R. AUTAR AND A.B. KHAN, On generalized 'useful' information for incomplete distribution, J. Combinatorics, Information and System Sciences, 14(4) (1989), 187–191.

[2] M. BELIS AND S. GUIASU, A quantitative-qualitative measure of information in cybernetic systems, IEEE Transactions on Information Theory, 14 (1968), 593–594.

[3] U.S. BHAKER AND D.S. HOODA, Mean value characterization of 'useful' information measures, Tamkang J. Math., 24 (1993), 283–294.

[4] P.K. BHATIA, On a generalized 'useful' inaccuracy for incomplete probability distribution, Soochow J. Math., 25(2) (1999), 131–135.

[5] A. FEINSTEIN, Foundations of Information Theory, McGraw-Hill, New York (1956).

[6] S. GUIASU AND C.F. PICARD, Borne inférieure de la longueur de certains codes, C.R. Acad. Sci. Paris, 273 (1971), 248–251.

[7] GURDIAL AND F. PESSOA, On 'useful' information of order α, J. Combinatorics, Information and System Sciences, 2 (1977), 30–35.

[8] D.S. HOODA AND U.S. BHAKER, A generalized 'useful' information measure and coding theorems, Soochow J. Math., 23 (1997), 53–62.

[9] D.S. HOODA AND A. RAM, Characterization of a non-additive 'useful' information measure, Recent Advances in Information Theory, Statistics and Computer Applications, CCS Haryana Agricultural University, Hisar (1998), 64–77.

[10] P. JAIN AND R.K. TUTEJA, On coding theorems connected with 'useful' entropy of order β, International Journal of Mathematics and Mathematical Sciences, 12(1) (1989), 193–198.

[11] G. LONGO, A noiseless coding theorem for sources having utilities, SIAM J. Appl. Math., 30 (1976), 739–748.

[12] G. LONGO, Quantitative-Qualitative Measure of Information, Springer-Verlag, New York (1972).

[13] C.E. SHANNON, A mathematical theory of communication, Bell System Technical Journal, 27 (1948), 379–423, 623–656.

[14] O. SHISHA, Inequalities, Academic Press, New York (1967).

[15] R.P. SINGH, R. KUMAR AND R.K. TUTEJA, Applications of Hölder's inequality in information theory, Information Sciences, 152 (2003), 145–154.

[16] H.C. TANEJA, D.S. HOODA AND R.K. TUTEJA, Coding theorems on a generalized 'useful' information, Soochow J. Math., 11 (1985), 123–131.
