Coding Theorems on Generalized Cost Measure

Rayees Ahmad Dar and M.A.K. Baig

vol. 9, iss. 1, art. 8, 2008


Department of Statistics, University of Kashmir, Srinagar-190006, India.

E-mail: rayees_stats@yahoo.com

Received: 6 June, 2007

Accepted: 13 December, 2007

Communicated by: N.S. Barnett

2000 AMS Sub. Class.: 94A17, 94A24.

Key words: Decipherable code, Source alphabet, Codeword, Cost function, Hölder's inequality, Codeword length.

Abstract: In the present communication, a generalized cost measure of the utilities and lengths of output codewords from a memoryless source is defined. Lower and upper bounds are obtained in terms of a non-additive generalized measure of ‘useful’ information.


Contents

1 Introduction

2 Generalized Measures of Cost

3 Bounds on the Generalized Cost Measures


1. Introduction

Let a finite set of $n$ source symbols $X = (x_1, x_2, \ldots, x_n)$ with probabilities $P = (p_1, p_2, \ldots, p_n)$ be encoded using a code alphabet of size $D$ $(D \ge 2)$; then there is a uniquely decipherable/instantaneous code with lengths $l_1, l_2, \ldots, l_n$ if and only if

(1.1) $\sum_{i=1}^{n} D^{-l_i} \le 1.$

Inequality (1.1) is known as the Kraft inequality [5]. If $L = \sum_{i=1}^{n} p_i l_i$ is the average codeword length, then for a code which satisfies (1.1), Shannon's coding theorem for a noiseless channel (Feinstein [5]) gives a lower bound for $L$ in terms of Shannon's entropy [13]:

(1.2) $L \ge H(P),$

with equality iff $l_i = -\log p_i$ for all $i = 1, 2, \ldots, n$. All logarithms are to base $D$.
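The Kraft inequality (1.1) and the bound (1.2) are easy to verify numerically. The following Python sketch (mine, not part of the paper; all names are illustrative) checks both for a dyadic distribution, where $l_i = -\log_2 p_i$ attains equality in (1.2):

```python
import math

def kraft_sum(lengths, D=2):
    """Left-hand side of the Kraft inequality (1.1): sum of D**(-l_i)."""
    return sum(D ** (-l) for l in lengths)

def shannon_entropy(p, D=2):
    """Shannon entropy H(P), with logarithms to base D."""
    return -sum(pi * math.log(pi, D) for pi in p if pi > 0)

def mean_length(p, lengths):
    """Average codeword length L = sum of p_i * l_i."""
    return sum(pi * li for pi, li in zip(p, lengths))

p = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]          # l_i = -log2 p_i, so L = H(P) exactly

assert kraft_sum(lengths) <= 1
assert abs(mean_length(p, lengths) - shannon_entropy(p)) < 1e-9
```

For any other lengths satisfying (1.1), `mean_length` can only exceed `shannon_entropy`, which is exactly the content of (1.2).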

Belis and Guiasu [2] observed that a source is not completely specified by the probability distribution $P$ over the source symbols $X$ in the absence of its qualitative character. It can also be assumed that the source letters or symbols are assigned weights according to their importance or utility in the view of the experimenter.

Let $U = (u_1, u_2, \ldots, u_n)$ be a set of positive real numbers, where $u_i$ is the utility or importance of outcome $x_i$. The utility is, in general, independent of $p_i$, the probability of encoding the source symbol $x_i$. The information source is thus given by

(1.3) $S = \begin{pmatrix} x_1 & x_2 & \cdots & x_n \\ p_1 & p_2 & \cdots & p_n \\ u_1 & u_2 & \cdots & u_n \end{pmatrix},$


where $u_i > 0$, $p_i \ge 0$ and $\sum_{i=1}^{n} p_i = 1$.

Belis and Guiasu [2] introduced the following qualitative-quantitative measure of information:

(1.4) $H(P;U) = -\sum_{i=1}^{n} u_i p_i \log p_i,$

which is a measure of the average quantity of ‘valuable’ or ‘useful’ information provided by the information source (1.3). Guiasu and Picard [6] considered the problem of encoding the letter output of the source (1.3) by means of a single-letter prefix code whose codewords $w_1, w_2, \ldots, w_n$ have lengths $l_1, l_2, \ldots, l_n$ respectively, satisfying the Kraft inequality (1.1). They introduced the following ‘useful’ mean length of the code:

(1.5) $L(P;U) = \dfrac{\sum_{i=1}^{n} u_i p_i l_i}{\sum_{i=1}^{n} u_i p_i}.$

Longo [11] interpreted (1.5) as the average transmission cost of the letters $x_i$ and obtained the following lower bound for the cost measure (1.5):

(1.6) $L(P;U) \ge H(P;U),$

where

$H(P;U) = \dfrac{-\sum_{i=1}^{n} u_i p_i \log p_i}{\sum_{i=1}^{n} u_i p_i}$

is the ‘useful’ information measure due to Guiasu and Picard [6], which was also characterized by Bhaker and Hooda [3] by a mean value representation.
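Longo's bound (1.6) can likewise be checked numerically. In the hypothetical example below (my own, with $D = 2$ and $l_i = -\log_2 p_i$, so that equality holds), the ‘useful’ mean length (1.5) attains the ‘useful’ entropy exactly:

```python
import math

def useful_mean_length(p, u, lengths):
    """'Useful' mean length (1.5): sum(u_i p_i l_i) / sum(u_i p_i)."""
    num = sum(ui * pi * li for ui, pi, li in zip(u, p, lengths))
    den = sum(ui * pi for ui, pi in zip(u, p))
    return num / den

def useful_entropy(p, u, D=2):
    """Guiasu-Picard 'useful' information measure H(P;U)."""
    num = -sum(ui * pi * math.log(pi, D) for ui, pi in zip(u, p) if pi > 0)
    den = sum(ui * pi for ui, pi in zip(u, p))
    return num / den

p = [0.5, 0.25, 0.25]
u = [3.0, 1.0, 2.0]
lengths = [1, 2, 2]   # l_i = -log2 p_i; these satisfy Kraft with D = 2

# (1.6): L(P;U) >= H(P;U), with equality for these lengths
assert useful_mean_length(p, u, lengths) >= useful_entropy(p, u) - 1e-9
```

Lengthening any codeword while keeping (1.1) satisfied raises `useful_mean_length` and leaves `useful_entropy` unchanged, so the bound stays strict.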


2. Generalized Measures of Cost

In the derivation of the cost measure (1.5) it is assumed that the cost is a linear function of the code length, but this is not always the case. There are occasions when the cost behaves like an exponential function of the codeword lengths; such cases occur frequently in market equilibrium and growth models in economics. Thus it may sometimes be more appropriate to choose a code which minimizes the monotonic function

(2.1) $C = \sum_{i=1}^{n} u_i^{\beta} p_i^{\beta} D^{\frac{1-\alpha}{\alpha} l_i},$

where $\alpha > 0$ $(\alpha \ne 1)$ and $\beta > 0$ are the parameters related to the cost.

In order to make the result of the paper more comparable with the usual noiseless coding theorem, instead of minimizing (2.1) we minimize

(2.2) $L_{\alpha}^{\beta}(U) = \dfrac{1}{2^{1-\alpha}-1}\left[\left(\dfrac{\sum_{i=1}^{n} (u_i p_i)^{\beta} D^{\frac{1-\alpha}{\alpha} l_i}}{\sum_{i=1}^{n} (u_i p_i)^{\beta}}\right)^{\alpha} - 1\right],$

where $\alpha > 0$ $(\alpha \ne 1)$ and $\beta > 0$, which is a monotonic function of $C$. We define (2.2) as the ‘useful’ average code length of order $\alpha$ and type $\beta$.

Clearly, if $\alpha \to 1$ and $\beta = 1$, (2.2) reduces to (1.5), which further reduces to the ordinary mean length given by Shannon [13] when $u_i = 1$ for all $i = 1, 2, \ldots, n$. We also note that (2.2) is a monotonic non-decreasing function of $\alpha$, and that if all the $l_i$ are the same, say $l_i = l$ for each $i$, then as $\alpha \to 1$, $L_{\alpha}^{\beta}(U) = l$. This is an important property for any measure of length to possess.
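The property that $L_{\alpha}^{\beta}(U)$ returns the common length $l$ when all codewords have equal length and $\alpha \to 1$ can be illustrated numerically. A sketch under my own choices ($D = 2$, $\alpha$ close to 1); the data and the function name are hypothetical, not from the paper:

```python
def L_beta_alpha(p, u, lengths, alpha, beta, D=2):
    """'Useful' average code length of order alpha and type beta, eq. (2.2)."""
    w = [(ui * pi) ** beta for ui, pi in zip(u, p)]
    ratio = sum(wi * D ** (((1 - alpha) / alpha) * li)
                for wi, li in zip(w, lengths)) / sum(w)
    return (ratio ** alpha - 1) / (2 ** (1 - alpha) - 1)

p = [0.4, 0.35, 0.25]
u = [1.5, 2.0, 0.5]

# All codeword lengths equal (l_i = 3): as alpha -> 1 the measure tends to
# l = 3, whatever the utilities and the type beta (the weights cancel).
value = L_beta_alpha(p, u, [3, 3, 3], alpha=0.9999, beta=1.3)
assert abs(value - 3) < 0.01
```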

In the next section, we derive the lower and upper bounds of the cost function (2.2) in terms of the following ‘useful’ information measure of order $\alpha$ and type $\beta$:

(2.3) $H_{\alpha}^{\beta}(P;U) = \dfrac{1}{2^{1-\alpha}-1}\left[\dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} - 1\right],$


where $\alpha > 0$ $(\alpha \ne 1)$, $\beta > 0$, $p_i \ge 0$ for $i = 1, 2, \ldots, n$, and $\sum_{i=1}^{n} p_i \le 1$.

(i) When $\beta = 1$, (2.3) reduces to the measure of ‘useful’ information proposed and characterised by Hooda and Ram [9].

(ii) If $\alpha \to 1$, $\beta = 1$, (2.3) reduces to the measure given by Belis and Guiasu [2].

(iii) If $\alpha \to 1$, $\beta = 1$ and $u_i = 1$ for all $i = 1, 2, \ldots, n$, (2.3) reduces to the well-known measure given by Shannon [13].

Also, we have used the condition

(2.4) $\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta-1} D^{-l_i} \le \sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}$

to find the bounds. It may be seen that in the case when $\beta = 1$ and $u_i = 1$ for all $i = 1, 2, \ldots, n$, (2.4) reduces to the Kraft inequality (1.1). Here $D$ $(D \ge 2)$ is the size of the code alphabet.
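Condition (2.4) and the reductions (i)–(iii) can be spot-checked in a few lines. A sketch with my own names and a dyadic example distribution ($D = 2$), not taken from the paper:

```python
def condition_24(p, u, lengths, beta, D=2):
    """Both sides of condition (2.4): sum u^b p^(b-1) D^(-l) vs sum u^b p^b."""
    lhs = sum(ui ** beta * pi ** (beta - 1) * D ** (-li)
              for ui, pi, li in zip(u, p, lengths))
    rhs = sum(ui ** beta * pi ** beta for ui, pi in zip(u, p))
    return lhs, rhs

def H_beta_alpha(p, u, alpha, beta):
    """'Useful' information measure (2.3) of order alpha and type beta."""
    num = sum(ui ** beta * pi ** (alpha + beta - 1) for ui, pi in zip(u, p))
    den = sum(ui ** beta * pi ** beta for ui, pi in zip(u, p))
    return (num / den - 1) / (2 ** (1 - alpha) - 1)

p = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]

# With beta = 1 and all utilities 1, (2.4) is exactly the Kraft inequality (1.1)
lhs, rhs = condition_24(p, [1.0] * 4, lengths, beta=1.0)
assert abs(rhs - 1.0) < 1e-12 and lhs <= rhs

# Reduction (iii): alpha -> 1, beta = 1, u_i = 1 approaches Shannon's entropy
# (1.75 bits for this distribution)
assert abs(H_beta_alpha(p, [1.0] * 4, 0.999, 1.0) - 1.75) < 0.02
```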

Longo [12], Gurdial and Pessoa [7], Autar and Khan [1], Jain and Tuteja [10], Taneja et al. [16], Bhatia [4], Singh, Kumar and Tuteja [15] and Hooda and Bhaker [8] considered the problem of a ‘useful’ information measure in the context of noiseless coding theorems for sources involving utilities.

In this paper, we study upper and lower bounds by considering a new function dependent on the parameter $\alpha$, the type $\beta$ and a utility function. Our motivation for studying this new function is that it generalizes some entropy functions already existing in the literature. The function under study is closely related to the Tsallis entropy, which is used in physics.


3. Bounds on the Generalized Cost Measures

Theorem 3.1. For all integers $D$ $(D \ge 2)$, let $l_i$ satisfy (2.4); then the generalized average ‘useful’ codeword length satisfies

(3.1) $L_{\alpha}^{\beta}(U) \ge H_{\alpha}^{\beta}(P;U),$

and the equality holds iff

(3.2) $l_i = -\log p_i^{\alpha} + \log \dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}}.$

Proof. By Hölder's inequality [14],

(3.3) $\sum_{i=1}^{n} x_i y_i \ge \left(\sum_{i=1}^{n} x_i^{p}\right)^{\frac{1}{p}} \left(\sum_{i=1}^{n} y_i^{q}\right)^{\frac{1}{q}}$

for all $x_i, y_i > 0$, $i = 1, 2, \ldots, n$, where $\frac{1}{p} + \frac{1}{q} = 1$ with $p < 1$ $(p \ne 0)$, $q < 0$, or $q < 1$ $(q \ne 0)$, $p < 0$.

We see that equality holds if and only if there exists a positive constant $c$ such that

(3.4) $x_i^{p} = c\, y_i^{q}.$

Making the substitutions

$p = \dfrac{\alpha-1}{\alpha}, \qquad q = 1-\alpha,$

$x_i = \dfrac{(u_i p_i)^{\frac{\beta\alpha}{\alpha-1}}\, D^{-l_i}}{\left[\sum_{i=1}^{n} (u_i p_i)^{\beta}\right]^{\frac{\alpha}{\alpha-1}}}, \qquad y_i = \dfrac{u_i^{\frac{\beta}{1-\alpha}}\, p_i^{\frac{\alpha+\beta-1}{1-\alpha}}}{\left[\sum_{i=1}^{n} (u_i p_i)^{\beta}\right]^{\frac{1}{1-\alpha}}}$


in (3.3), we get

$\dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta-1} D^{-l_i}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} \ge \left[\dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta} D^{\frac{1-\alpha}{\alpha} l_i}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}}\right]^{\frac{\alpha}{\alpha-1}} \left[\dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}}\right]^{\frac{1}{1-\alpha}}.$

Using the condition (2.4), we get

$\left[\dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta} D^{\frac{1-\alpha}{\alpha} l_i}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}}\right]^{\frac{\alpha}{1-\alpha}} \ge \left[\dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}}\right]^{\frac{1}{1-\alpha}}.$

Taking $0 < \alpha < 1$ and raising both sides to the power $(1-\alpha)$,

$\left[\dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta} D^{\frac{1-\alpha}{\alpha} l_i}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}}\right]^{\alpha} \ge \dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}}.$

Subtracting 1 from both sides and then multiplying by $\frac{1}{2^{1-\alpha}-1} > 0$ for $0 < \alpha < 1$, we obtain $L_{\alpha}^{\beta}(U) \ge H_{\alpha}^{\beta}(P;U)$.

For $\alpha > 1$, the proof follows along similar lines.
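Theorem 3.1 can be exercised numerically: for any lengths satisfying (2.4), $L_{\alpha}^{\beta}(U)$ should dominate $H_{\alpha}^{\beta}(P;U)$ on both sides of $\alpha = 1$. A self-contained sketch with hypothetical data and my own function names (not from the paper):

```python
def L_ab(p, u, l, a, b, D=2):
    """'Useful' average code length (2.2) of order a and type b."""
    w = [(ui * pi) ** b for ui, pi in zip(u, p)]
    r = sum(wi * D ** (((1 - a) / a) * li) for wi, li in zip(w, l)) / sum(w)
    return (r ** a - 1) / (2 ** (1 - a) - 1)

def H_ab(p, u, a, b):
    """'Useful' information measure (2.3) of order a and type b."""
    num = sum(ui ** b * pi ** (a + b - 1) for ui, pi in zip(u, p))
    den = sum(ui ** b * pi ** b for ui, pi in zip(u, p))
    return (num / den - 1) / (2 ** (1 - a) - 1)

def satisfies_24(p, u, l, b, D=2):
    """Condition (2.4) for the given lengths."""
    lhs = sum(ui ** b * pi ** (b - 1) * D ** (-li) for ui, pi, li in zip(u, p, l))
    return lhs <= sum(ui ** b * pi ** b for ui, pi in zip(u, p)) + 1e-12

p, u, l = [0.5, 0.3, 0.2], [2.0, 1.0, 3.0], [1, 2, 3]
for a in (0.5, 2.0):          # one value on each side of alpha = 1
    for b in (0.7, 1.0, 1.5):
        if satisfies_24(p, u, l, b):
            # Theorem 3.1: the lower bound (3.1) holds
            assert L_ab(p, u, l, a, b) >= H_ab(p, u, a, b) - 1e-12
```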

Theorem 3.2. For every code with lengths $\{l_i\}$, $i = 1, 2, \ldots, n$, as in Theorem 3.1, $L_{\alpha}^{\beta}(U)$ can be made to satisfy the inequality

(3.5) $L_{\alpha}^{\beta}(U) < H_{\alpha}^{\beta}(P;U)\, D^{1-\alpha} + \dfrac{D^{1-\alpha}-1}{2^{1-\alpha}-1}.$

Proof. Let $l_i$ be the positive integer satisfying the inequality

(3.6) $-\log p_i^{\alpha} + \log \dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} \le l_i < -\log p_i^{\alpha} + \log \dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} + 1.$


Consider the interval

(3.7) $\delta_i = \left[-\log p_i^{\alpha} + \log \dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}},\; -\log p_i^{\alpha} + \log \dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} + 1\right]$

of length 1. In every $\delta_i$ there lies exactly one positive integer $l_i$, such that

(3.8) $0 < -\log p_i^{\alpha} + \log \dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} \le l_i < -\log p_i^{\alpha} + \log \dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} + 1.$

We will first show that the sequence $l_1, l_2, \ldots, l_n$ thus defined satisfies (2.4). From (3.8), we have

$-\log p_i^{\alpha} + \log \dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} \le l_i,$

that is,

$p_i^{\alpha} \left(\dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}}\right)^{-1} \ge D^{-l_i}.$

Multiplying both sides by $u_i^{\beta} p_i^{\beta-1}$ and summing over $i = 1, 2, \ldots, n$, we get (2.4). The last inequality of (3.8) gives

$l_i < -\log p_i^{\alpha} + \log \dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}} + 1,$


or

$D^{l_i} < p_i^{-\alpha}\, \dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}}\, D.$

For $0 < \alpha < 1$, raising both sides to the (positive) power $\frac{1-\alpha}{\alpha}$ we obtain

$D^{\frac{1-\alpha}{\alpha} l_i} < p_i^{\alpha-1} \left(\dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}}\right)^{\frac{1-\alpha}{\alpha}} D^{\frac{1-\alpha}{\alpha}}.$

Multiplying both sides by $\frac{u_i^{\beta} p_i^{\beta}}{\sum_{i=1}^{n} u_i^{\beta} p_i^{\beta}}$ and summing over $i = 1, 2, \ldots, n$ gives

$\dfrac{\sum_{i=1}^{n} (u_i p_i)^{\beta} D^{\frac{1-\alpha}{\alpha} l_i}}{\sum_{i=1}^{n} (u_i p_i)^{\beta}} < \left(\dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} (u_i p_i)^{\beta}}\right)^{\frac{1}{\alpha}} D^{\frac{1-\alpha}{\alpha}},$

and raising both sides to the (positive) power $\alpha$,

$\left(\dfrac{\sum_{i=1}^{n} (u_i p_i)^{\beta} D^{\frac{1-\alpha}{\alpha} l_i}}{\sum_{i=1}^{n} (u_i p_i)^{\beta}}\right)^{\alpha} < \dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} (u_i p_i)^{\beta}}\, D^{1-\alpha}.$

Since $2^{1-\alpha} - 1 > 0$ for $0 < \alpha < 1$, after suitable operations we obtain

$\dfrac{1}{2^{1-\alpha}-1}\left[\left(\dfrac{\sum_{i=1}^{n} (u_i p_i)^{\beta} D^{\frac{1-\alpha}{\alpha} l_i}}{\sum_{i=1}^{n} (u_i p_i)^{\beta}}\right)^{\alpha} - 1\right] < \dfrac{1}{2^{1-\alpha}-1}\left[\dfrac{\sum_{i=1}^{n} u_i^{\beta} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} (u_i p_i)^{\beta}} - 1\right] D^{1-\alpha} + \dfrac{D^{1-\alpha}-1}{2^{1-\alpha}-1}.$


We can write

$L_{\alpha}^{\beta}(U) < H_{\alpha}^{\beta}(P;U)\, D^{1-\alpha} + \dfrac{D^{1-\alpha}-1}{2^{1-\alpha}-1}.$

As $D \ge 2$, we have $\dfrac{D^{1-\alpha}-1}{2^{1-\alpha}-1} \ge 1$, from which it follows that the upper bound for $L_{\alpha}^{\beta}(U)$ in (3.5) is at least unity.

Also, for $\alpha > 1$, the proof follows similarly.
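The construction in the proof of Theorem 3.2 can be replayed numerically: pick each $l_i$ as the unique integer in the unit interval (3.7), confirm condition (2.4), and confirm that $L_{\alpha}^{\beta}(U)$ falls between the bounds of (3.1) and (3.5). A sketch for $0 < \alpha < 1$ (where the lower endpoint of (3.7) is positive); the data and names are my own, not the paper's:

```python
import math

def useful_bounds_demo(p, u, alpha, beta, D=2):
    """Choose l_i per (3.6)/(3.7) and check (2.4) together with the bounds of
    Theorems 3.1 and 3.2. Assumes 0 < alpha < 1."""
    num = sum(ui ** beta * pi ** (alpha + beta - 1) for ui, pi in zip(u, p))
    den = sum(ui ** beta * pi ** beta for ui, pi in zip(u, p))
    R = num / den

    # l_i = the unique integer in [lower, lower + 1), i.e. ceil(lower)
    l = [math.ceil(-math.log(pi ** alpha, D) + math.log(R, D)) for pi in p]

    # these lengths satisfy condition (2.4)
    lhs = sum(ui ** beta * pi ** (beta - 1) * D ** (-li)
              for ui, pi, li in zip(u, p, l))
    assert lhs <= den + 1e-9

    w = [(ui * pi) ** beta for ui, pi in zip(u, p)]
    r = sum(wi * D ** (((1 - alpha) / alpha) * li) for wi, li in zip(w, l)) / sum(w)
    L = (r ** alpha - 1) / (2 ** (1 - alpha) - 1)          # eq. (2.2)
    H = (R - 1) / (2 ** (1 - alpha) - 1)                   # eq. (2.3)
    upper = H * D ** (1 - alpha) + (D ** (1 - alpha) - 1) / (2 ** (1 - alpha) - 1)

    assert H - 1e-9 <= L < upper + 1e-9                    # (3.1) and (3.5)
    return L, H, upper

useful_bounds_demo([0.5, 0.3, 0.2], [2.0, 1.0, 3.0], alpha=0.5, beta=1.0)
```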


References

[1] R. AUTAR AND A.B. KHAN, On generalized ‘useful’ information for incomplete distribution, J. Combinatorics, Information and System Sciences, 14(4) (1989), 187–191.

[2] M. BELIS AND S. GUIASU, A quantitative-qualitative measure of information in cybernetic systems, IEEE Transactions on Information Theory, 14 (1968), 593–594.

[3] U.S. BHAKER AND D.S. HOODA, Mean value characterization of ‘useful’ information measures, Tamkang J. Math., 24 (1993), 283–294.

[4] P.K. BHATIA, On a generalized ‘useful’ inaccuracy for incomplete probability distribution, Soochow J. Math., 25(2) (1999), 131–135.

[5] A. FEINSTEIN, Foundations of Information Theory, McGraw-Hill, New York (1956).

[6] S. GUIASU AND C.F. PICARD, Borne inférieure de la longueur de certains codes, C.R. Acad. Sci. Paris, 27 C (1971), 248–251.

[7] GURDIAL AND F. PESSOA, On useful information of order α, J. Combinatorics, Information and System Sciences, 2 (1977), 30–35.

[8] D.S. HOODA AND U.S. BHAKER, A generalized ‘useful’ information measure and coding theorems, Soochow J. Math., 23 (1997), 53–62.

[9] D.S. HOODA AND A. RAM, Characterization of non-additive ‘useful’ information measure, Recent Advances in Information Theory, Statistics and Computer Applications, CCS Haryana Agricultural University, Hisar, 64–77 (1998).


[10] P. JAIN AND R.K. TUTEJA, On coding theorems connected with ‘useful’ entropy of order β, International Journal of Mathematics and Mathematical Sciences, 12(1) (1989), 193–198.

[11] G. LONGO, A noiseless coding theorem for sources having utilities, SIAM J. Appl. Math., 30 (1976), 739–748.

[12] G. LONGO, Quantitative-Qualitative Measure of Information, Springer-Verlag, New York (1972).

[13] C.E. SHANNON, A mathematical theory of communication, Bell System Technical Journal, 27 (1948), 379–423, 623–656.

[14] O. SHISHA, Inequalities, Academic Press, New York (1967).

[15] R.P. SINGH, R. KUMAR AND R.K. TUTEJA, Applications of Hölder's inequality in information theory, Information Sciences, 152 (2003), 145–154.

[16] H.C. TANEJA, D.S. HOODA AND R.K. TUTEJA, Coding theorems on a generalized ‘useful’ information, Soochow J. Math., 11 (1985), 123–131.
