MATRIX TRACE INEQUALITIES ON THE TSALLIS ENTROPIES

SHIGERU FURUICHI

DEPARTMENT OF ELECTRONICS AND COMPUTER SCIENCE

TOKYO UNIVERSITY OF SCIENCE, YAMAGUCHI

SANYO-ONODA CITY, YAMAGUCHI, 756-0884, JAPAN

furuichi@ed.yama.tus.ac.jp

Received 28 June, 2007; accepted 29 January, 2008. Communicated by F. Zhang.

Dedicated to Professor Kunio Oshima on his 60th birthday.

ABSTRACT. Maximum entropy principles in nonextensive statistical physics are revisited as an application of the Tsallis relative entropy defined for non-negative matrices in the framework of matrix analysis. In addition, some matrix trace inequalities related to the Tsallis relative entropy are studied.

Key words and phrases: Matrix trace inequality, Tsallis entropy, Tsallis relative entropy and maximum entropy principle.

2000 Mathematics Subject Classification. 47A63, 94A17, 15A39.

1. INTRODUCTION

In 1988, Tsallis introduced the one-parameter extended entropy for the analysis of a physical model in statistical physics [10]. In our previous papers, we studied the properties of the Tsallis relative entropy [5, 4] and the Tsallis relative operator entropy [17, 6]. Maximum entropy problems in Tsallis statistics have been studied for classical systems and quantum systems [9, 11, 2, 1]; such problems were solved by the use of the Lagrange multipliers formalism. We give a new approach to these problems: we solve them by applying the non-negativity of the Tsallis relative entropy, without using the Lagrange multipliers formalism.

In addition, we show further results on the Tsallis relative entropy.

The author would like to thank the reviewer for providing valuable comments to improve the manuscript. The author would like to thank Professor K. Yanagi and Professor K. Kuriyama for providing valuable comments and constant encouragement. This work was supported by the Japanese Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Encouragement of Young Scientists (B), 17740068. This work was also partially supported by the Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Scientific Research (B), 18300003.


In the present paper, the set of $n \times n$ complex matrices is denoted by $M_n(\mathbb{C})$. That is, we deal with $n \times n$ matrices because of Lemma 2.2 in Section 2. However, some results derived in the present paper also hold in the infinite-dimensional case. In the sequel, the set of all density matrices (quantum states) is represented by

$$\mathrm{D}_n(\mathbb{C}) \equiv \{X \in M_n(\mathbb{C}) : X \geq 0,\ \mathrm{Tr}[X] = 1\}.$$

$X \in M_n(\mathbb{C})$ is called a non-negative matrix, denoted by $X \geq 0$, if we have $\langle Xx, x\rangle \geq 0$ for all $x \in \mathbb{C}^n$. That is, for a Hermitian matrix $X$, $X \geq 0$ means that all eigenvalues of $X$ are non-negative. In addition, $X \geq Y$ is defined by $X - Y \geq 0$. For $-I \leq X \leq I$ and $\lambda \in (-1,0)\cup(0,1)$, we denote the generalized exponential function by $\exp_\lambda(X) \equiv (I + \lambda X)^{1/\lambda}$. As the inverse function of $\exp_\lambda(\cdot)$, for $X \geq 0$ and $\lambda \in (-1,0)\cup(0,1)$, we denote the generalized logarithmic function by $\ln_\lambda X \equiv \frac{X^{\lambda}-I}{\lambda}$. Then the Tsallis relative entropy and the Tsallis entropy for non-negative matrices $X$ and $Y$ are defined by

$$D_\lambda(X|Y) \equiv \mathrm{Tr}\!\left[X^{1-\lambda}(\ln_\lambda X - \ln_\lambda Y)\right], \qquad S_\lambda(X) \equiv -D_\lambda(X|I).$$

These entropies are generalizations of the von Neumann entropy [16] and of the Umegaki relative entropy [14] in the sense that

$$\lim_{\lambda \to 0} S_\lambda(X) = S_0(X) \equiv -\mathrm{Tr}[X \log X]$$
and
$$\lim_{\lambda \to 0} D_\lambda(X|Y) = D_0(X|Y) \equiv \mathrm{Tr}[X(\log X - \log Y)].$$
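The following short numerical sketch (not part of the original paper) illustrates these definitions with NumPy; the helper names mpow, mlog, ln_l, D_l and S_l are ours, and the final lines check the $\lambda \to 0$ limits above on random density matrices.

```python
import numpy as np

def mpow(A, t):
    # fractional power of a Hermitian positive semidefinite matrix via eigendecomposition
    w, V = np.linalg.eigh(A)
    return (V * np.power(w, t)) @ V.conj().T

def mlog(A):
    # logarithm of a positive definite matrix
    w, V = np.linalg.eigh(A)
    return (V * np.log(w)) @ V.conj().T

def ln_l(A, lam):
    # generalized logarithm ln_lambda(A) = (A^lambda - I)/lambda
    return (mpow(A, lam) - np.eye(len(A))) / lam

def D_l(X, Y, lam):
    # Tsallis relative entropy Tr[X^{1-lambda}(ln_lambda X - ln_lambda Y)]
    return np.trace(mpow(X, 1 - lam) @ (ln_l(X, lam) - ln_l(Y, lam))).real

def S_l(X, lam):
    # Tsallis entropy S_lambda(X) = -D_lambda(X|I)
    return -D_l(X, np.eye(len(X)), lam)

# as lambda -> 0, the von Neumann entropy and the Umegaki relative entropy are recovered
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)); X = A @ A.T; X /= np.trace(X)
B = rng.standard_normal((3, 3)); Y = B @ B.T; Y /= np.trace(Y)
print(S_l(X, 1e-6) + np.trace(X @ mlog(X)).real)                 # ~ 0
print(D_l(X, Y, 1e-6) - np.trace(X @ (mlog(X) - mlog(Y))).real)  # ~ 0
```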

2. MAXIMUM ENTROPY PRINCIPLE IN NONEXTENSIVE STATISTICAL PHYSICS

In this section, we study the maximization problem of the Tsallis entropy with a constraint on the $\lambda$-expectation value. In quantum systems, the expectation value of an observable (a Hermitian matrix) $H$ in a quantum state (a density matrix) $X \in \mathrm{D}_n(\mathbb{C})$ is written as $\mathrm{Tr}[XH]$. Here, we consider the $\lambda$-expectation value $\mathrm{Tr}[X^{1-\lambda}H]$ as a generalization of the usual expectation value.

Firstly, we impose the following constraint on the maximization problem of the Tsallis entropy:

$$\widetilde{C}_\lambda \equiv \left\{X \in \mathrm{D}_n(\mathbb{C}) : \mathrm{Tr}[X^{1-\lambda}H] = 0\right\},$$

for a given $n \times n$ Hermitian matrix $H$. We denote the usual matrix norm by $\|\cdot\|$, namely, for $A \in M_n(\mathbb{C})$ and $x \in \mathbb{C}^n$,
$$\|A\| \equiv \max_{\|x\|=1} \|Ax\|.$$
Then we have the following theorem.

Theorem 2.1. Let $Y = Z_\lambda^{-1}\exp_\lambda(-H/\|H\|)$, where $Z_\lambda \equiv \mathrm{Tr}[\exp_\lambda(-H/\|H\|)]$, for an $n \times n$ Hermitian matrix $H$ and $\lambda \in (-1,0)\cup(0,1)$. If $X \in \widetilde{C}_\lambda$, then $S_\lambda(X) \leq -c_\lambda \ln_\lambda Z_\lambda^{-1}$, where $c_\lambda \equiv \mathrm{Tr}[X^{1-\lambda}]$.


Proof. Since $Z_\lambda \geq 0$ and we have $\ln_\lambda(x^{-1}Y) = \ln_\lambda Y + (\ln_\lambda x^{-1})Y^{\lambda}$ for a non-negative matrix $Y$ and a scalar $x$, we calculate
$$\begin{aligned}
\mathrm{Tr}[X^{1-\lambda}\ln_\lambda Y] &= \mathrm{Tr}\!\left[X^{1-\lambda}\ln_\lambda\!\left(Z_\lambda^{-1}\exp_\lambda(-H/\|H\|)\right)\right] \\
&= \mathrm{Tr}\!\left[X^{1-\lambda}\left(-H/\|H\| + \ln_\lambda Z_\lambda^{-1}\,(I-\lambda H/\|H\|)\right)\right] \\
&= \mathrm{Tr}\!\left[X^{1-\lambda}\left(\ln_\lambda Z_\lambda^{-1}\, I - Z_\lambda^{-\lambda} H/\|H\|\right)\right] = c_\lambda \ln_\lambda Z_\lambda^{-1},
\end{aligned}$$
since $\ln_\lambda Z_\lambda^{-1} = \frac{Z_\lambda^{-\lambda}-1}{\lambda}$ by the definition of the generalized logarithmic function $\ln_\lambda(\cdot)$, and since $\mathrm{Tr}[X^{1-\lambda}H] = 0$ for $X \in \widetilde{C}_\lambda$. By the non-negativity of the Tsallis relative entropy:
$$(2.1)\qquad \mathrm{Tr}[X^{1-\lambda}\ln_\lambda Y] \leq \mathrm{Tr}[X^{1-\lambda}\ln_\lambda X],$$
we have
$$S_\lambda(X) = -\mathrm{Tr}[X^{1-\lambda}\ln_\lambda X] \leq -\mathrm{Tr}[X^{1-\lambda}\ln_\lambda Y] = -c_\lambda \ln_\lambda Z_\lambda^{-1}. \qquad\square$$
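A quick numerical check of Theorem 2.1 (our illustration, reusing the helpers mpow and S_l from the sketch at the end of Section 1) uses a diagonal $H$ for which the constraint $\mathrm{Tr}[X^{1-\lambda}H] = 0$ holds by symmetry:

```python
lam = 0.4
H = np.diag([1.0, -1.0, 0.0])                                 # Hermitian observable
normH = np.linalg.norm(H, 2)                                  # operator norm ||H||
exp_l = lambda A, l: mpow(np.eye(len(A)) + l * A, 1.0 / l)    # generalized exponential
Z = np.trace(exp_l(-H / normH, lam)).real                     # Z_lambda
ln_scalar = lambda x, l: (x**l - 1) / l                       # ln_lambda for scalars
for p in (0.1, 0.25, 0.4):
    X = np.diag([p, p, 1 - 2 * p])        # density matrix with Tr[X^{1-lam} H] = 0
    c = np.trace(mpow(X, 1 - lam)).real   # c_lambda
    assert S_l(X, lam) <= -c * ln_scalar(1.0 / Z, lam) + 1e-12
```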

Next, we consider the slightly changed constraint:

$$C_\lambda \equiv \left\{X \in \mathrm{D}_n(\mathbb{C}) : \mathrm{Tr}[X^{1-\lambda}H] \leq \mathrm{Tr}[Y^{1-\lambda}H] \ \text{and}\ \mathrm{Tr}[X^{1-\lambda}] \leq \mathrm{Tr}[Y^{1-\lambda}]\right\}$$

for a given $n \times n$ Hermitian matrix $H$, as the maximization problem for the Tsallis entropy. To this end, we prepare the following lemma.

Lemma 2.2. For a given $n \times n$ Hermitian matrix $H$, if $n$ is a sufficiently large integer, then we have $Z_\lambda \geq 1$.

Proof.

(i) For a fixed $0 < \lambda < 1$ and a sufficiently large $n$, we have
$$(2.2)\qquad (1/n)^{\lambda} \leq 1-\lambda.$$
From the inequalities $-\|H\| I \leq H \leq \|H\| I$, we have
$$(2.3)\qquad (1-\lambda)^{1/\lambda} I \leq \exp_\lambda(-H/\|H\|) \leq (1+\lambda)^{1/\lambda} I.$$
By inequality (2.2), we have
$$\frac{1}{n} I \leq (1-\lambda)^{1/\lambda} I \leq \exp_\lambda(-H/\|H\|),$$
which implies $Z_\lambda \geq 1$.

(ii) For a fixed $-1 < \lambda < 0$ and a sufficiently large $n$, we have
$$(2.4)\qquad (1/n)^{\lambda} \geq 1-\lambda.$$
Analogously to (i), we have the inequalities (2.3) for $-1 < \lambda < 0$. By inequality (2.4), we have
$$\frac{1}{n} I \leq (1-\lambda)^{1/\lambda} I \leq \exp_\lambda(-H/\|H\|),$$
which implies $Z_\lambda \geq 1$. $\square$
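Lemma 2.2 is also easy to check numerically; in the sketch below (our illustration, reusing mpow from the earlier sketch), $n = 50$ comfortably exceeds the bound $(1-\lambda)^{-1/\lambda}$ implicit in the proof for both signs of $\lambda$:

```python
rng = np.random.default_rng(1)
n = 50
A = rng.standard_normal((n, n)); H = (A + A.T) / 2   # random Hermitian H
Hn = H / np.linalg.norm(H, 2)                        # so that -I <= Hn <= I
for lam in (0.5, -0.5):
    # exp_lambda(-Hn) = (I - lam * Hn)^{1/lam}; its trace is Z_lambda
    Z = np.trace(mpow(np.eye(n) - lam * Hn, 1.0 / lam)).real
    assert Z >= 1
```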


Then we have the following theorem by the use of Lemma 2.2.

Theorem 2.3. Let $Y = Z_\lambda^{-1}\exp_\lambda(-H/\|H\|)$, where $Z_\lambda \equiv \mathrm{Tr}[\exp_\lambda(-H/\|H\|)]$, for $\lambda \in (-1,0)\cup(0,1)$ and an $n \times n$ Hermitian matrix $H$. If $X \in C_\lambda$ and $n$ is sufficiently large, then $S_\lambda(X) \leq S_\lambda(Y)$.

Proof. Due to Lemma 2.2, we have $\ln_\lambda Z_\lambda^{-1} \leq 0$ for a sufficiently large $n$. Thus we have $\ln_\lambda Z_\lambda^{-1}\,\mathrm{Tr}[X^{1-\lambda}] \geq \ln_\lambda Z_\lambda^{-1}\,\mathrm{Tr}[Y^{1-\lambda}]$ for $X \in C_\lambda$. Similarly to the proof of Theorem 2.1, we have
$$\begin{aligned}
\mathrm{Tr}[X^{1-\lambda}\ln_\lambda Y] &= \mathrm{Tr}\!\left[X^{1-\lambda}\ln_\lambda\!\left(Z_\lambda^{-1}\exp_\lambda(-H/\|H\|)\right)\right] \\
&= \mathrm{Tr}\!\left[X^{1-\lambda}\left(-H/\|H\| + \ln_\lambda Z_\lambda^{-1}(I-\lambda H/\|H\|)\right)\right] \\
&= \mathrm{Tr}\!\left[X^{1-\lambda}\left(\ln_\lambda Z_\lambda^{-1}\, I - Z_\lambda^{-\lambda} H/\|H\|\right)\right] \\
&\geq \mathrm{Tr}\!\left[Y^{1-\lambda}\left(\ln_\lambda Z_\lambda^{-1}\, I - Z_\lambda^{-\lambda} H/\|H\|\right)\right] \\
&= \mathrm{Tr}\!\left[Y^{1-\lambda}\left(-H/\|H\| + \ln_\lambda Z_\lambda^{-1}(I-\lambda H/\|H\|)\right)\right] \\
&= \mathrm{Tr}\!\left[Y^{1-\lambda}\ln_\lambda\!\left(Z_\lambda^{-1}\exp_\lambda(-H/\|H\|)\right)\right] \\
&= \mathrm{Tr}[Y^{1-\lambda}\ln_\lambda Y].
\end{aligned}$$
By Eq. (2.1) we have
$$S_\lambda(X) = -\mathrm{Tr}[X^{1-\lambda}\ln_\lambda X] \leq -\mathrm{Tr}[X^{1-\lambda}\ln_\lambda Y] \leq -\mathrm{Tr}[Y^{1-\lambda}\ln_\lambda Y] = S_\lambda(Y). \qquad\square$$
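As a sanity check of Theorem 2.3 (our illustration, continuing with the helpers from the earlier sketches), one can sample random density matrices, keep those lying in $C_\lambda$, and verify that none exceeds $S_\lambda(Y)$:

```python
lam, n = 0.4, 4
A = rng.standard_normal((n, n)); H = (A + A.T) / 2
Hn = H / np.linalg.norm(H, 2)
Y = mpow(np.eye(n) - lam * Hn, 1.0 / lam)
Y = Y / np.trace(Y).real                  # Y = Z_lambda^{-1} exp_lambda(-H/||H||)
for _ in range(500):
    B = rng.standard_normal((n, n)); X = B @ B.T; X /= np.trace(X)
    in_C = (np.trace(mpow(X, 1 - lam) @ H).real <= np.trace(mpow(Y, 1 - lam) @ H).real and
            np.trace(mpow(X, 1 - lam)).real <= np.trace(mpow(Y, 1 - lam)).real)
    if in_C:
        assert S_l(X, lam) <= S_l(Y, lam) + 1e-10
```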

Remark 2.4. Since $-x^{1-\lambda}\ln_\lambda x$ is a strictly concave function, $S_\lambda$ is a strictly concave function on the set $C_\lambda$. This means that the maximizing $Y$ is uniquely determined, so that we may regard $Y$ as a generalized Gibbs state, since an original Gibbs state $e^{-\beta H}/\mathrm{Tr}[e^{-\beta H}]$, where $\beta \equiv 1/T$ and $T$ represents a physical temperature, gives the maximum value of the von Neumann entropy.

Thus, we may define a generalized Helmholtz free energy by
$$F_\lambda(X,H) \equiv \mathrm{Tr}[X^{1-\lambda}H] - \|H\|\, S_\lambda(X).$$

This can also be represented in terms of the Tsallis relative entropy as
$$F_\lambda(X,H) = \|H\|\, D_\lambda(X|Y) + \ln_\lambda Z_\lambda^{-1}\,\mathrm{Tr}\!\left[X^{1-\lambda}(\|H\| I - \lambda H)\right].$$
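Continuing the same numerical sketch (with H, Hn, Y, n, lam and the scalar helper ln_scalar from the snippets above), this representation can be verified directly; the two expressions agree to machine precision:

```python
normH = np.linalg.norm(H, 2)
Z = np.trace(mpow(np.eye(n) - lam * Hn, 1.0 / lam)).real
B = rng.standard_normal((n, n)); X = B @ B.T; X /= np.trace(X)    # an arbitrary state
F_direct = np.trace(mpow(X, 1 - lam) @ H).real - normH * S_l(X, lam)
F_via_D = (normH * D_l(X, Y, lam)
           + ln_scalar(1.0 / Z, lam)
             * np.trace(mpow(X, 1 - lam) @ (normH * np.eye(n) - lam * H)).real)
print(abs(F_direct - F_via_D))   # ~ 1e-14
```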

The following corollary easily follows by taking the limit as $\lambda \to 0$.

Corollary 2.5 ([12, 15]). Let $Y = Z_0^{-1}\exp(-H/\|H\|)$, where $Z_0 \equiv \mathrm{Tr}[\exp(-H/\|H\|)]$, for an $n \times n$ Hermitian matrix $H$.

(i) If $X \in \widetilde{C}_0$, then $S_0(X) \leq \log Z_0$.

(ii) If $X \in C_0$, then $S_0(X) \leq S_0(Y)$.


3. ON SOME TRACE INEQUALITIES RELATED TO THE TSALLIS RELATIVE ENTROPY

In this section, we consider an extension of the following inequality [8]:
$$(3.1)\qquad \mathrm{Tr}[X(\log X + \log Y)] \leq \frac{1}{p}\,\mathrm{Tr}\!\left[X \log\!\left(X^{p/2} Y^{p} X^{p/2}\right)\right]$$
for non-negative matrices $X$ and $Y$, and $p > 0$.

For the proof of Theorem 3.3 below, we use the following well-known inequalities.

Lemma 3.1 ([8]). For any Hermitian matrices $A$ and $B$, $0 \leq \lambda \leq 1$ and $p > 0$, we have the inequality
$$\mathrm{Tr}\!\left[\left(e^{pA}\,\sharp_\lambda\, e^{pB}\right)^{1/p}\right] \leq \mathrm{Tr}\!\left[e^{(1-\lambda)A + \lambda B}\right],$$
where the $\lambda$-geometric mean for positive matrices $A$ and $B$ is defined by
$$A\,\sharp_\lambda\, B \equiv A^{1/2}\left(A^{-1/2} B A^{-1/2}\right)^{\lambda} A^{1/2}.$$

Lemma 3.2 ([7, 13]). For any Hermitian matrices $G$ and $H$, we have the Golden-Thompson inequality:
$$\mathrm{Tr}\!\left[e^{G+H}\right] \leq \mathrm{Tr}\!\left[e^{G} e^{H}\right].$$
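Both lemmas are easy to test numerically; the sketch below (our illustration, reusing mpow from the sketch in Section 1) defines the $\lambda$-geometric mean and a Hermitian matrix exponential and checks a random instance:

```python
def mexp(A):
    # matrix exponential of a Hermitian matrix via eigendecomposition
    w, V = np.linalg.eigh(A)
    return (V * np.exp(w)) @ V.conj().T

def gmean(A, B, lam):
    # lambda-geometric mean  A #_lam B = A^{1/2} (A^{-1/2} B A^{-1/2})^lam A^{1/2}
    Ah, Ahi = mpow(A, 0.5), mpow(A, -0.5)
    return Ah @ mpow(Ahi @ B @ Ahi, lam) @ Ah

rng = np.random.default_rng(2)
n, lam, p = 3, 0.3, 2.0
A = rng.standard_normal((n, n)); A = (A + A.T) / 2    # Hermitian A
B = rng.standard_normal((n, n)); B = (B + B.T) / 2    # Hermitian B
# Lemma 3.1: Tr[(e^{pA} #_lam e^{pB})^{1/p}] <= Tr[e^{(1-lam)A + lam B}]
assert (np.trace(mpow(gmean(mexp(p * A), mexp(p * B), lam), 1.0 / p)).real
        <= np.trace(mexp((1 - lam) * A + lam * B)).real + 1e-12)
# Lemma 3.2 (Golden-Thompson): Tr[e^{A+B}] <= Tr[e^A e^B]
assert np.trace(mexp(A + B)).real <= np.trace(mexp(A) @ mexp(B)).real + 1e-12
```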

Theorem 3.3. For positive matrices $X$ and $Y$, $p \geq 1$ and $0 < \lambda \leq 1$, we have
$$(3.2)\qquad D_\lambda(X|Y) \leq -\mathrm{Tr}\!\left[X \ln_\lambda\!\left(X^{-p/2} Y^{p} X^{-p/2}\right)^{1/p}\right].$$

Proof. First of all, we note that we have the following inequality [3]:
$$(3.3)\qquad \mathrm{Tr}\!\left[(Y^{1/2} X Y^{1/2})^{rp}\right] \geq \mathrm{Tr}\!\left[(Y^{r/2} X^{r} Y^{r/2})^{p}\right]$$
for non-negative matrices $X$ and $Y$, and $0 \leq r \leq 1$, $p > 0$. Similarly to the proof of Theorem 2.2 in [5], inequality (3.2) easily follows by setting $A = \log X$ and $B = \log Y$ in Lemma 3.1, so that
$$(3.4)\qquad \mathrm{Tr}\!\left[(X^{p}\,\sharp_\lambda\, Y^{p})^{1/p}\right] \leq \mathrm{Tr}\!\left[e^{\log X^{1-\lambda} + \log Y^{\lambda}}\right] \leq \mathrm{Tr}\!\left[e^{\log X^{1-\lambda}} e^{\log Y^{\lambda}}\right] = \mathrm{Tr}\!\left[X^{1-\lambda} Y^{\lambda}\right],$$
by Lemma 3.2. In addition, we have
$$(3.5)\qquad \mathrm{Tr}[X^{r} Y^{r}] \leq \mathrm{Tr}\!\left[(Y^{1/2} X Y^{1/2})^{r}\right] \qquad (0 \leq r \leq 1),$$
on taking $p = 1$ in inequality (3.3). By (3.4) and (3.5) we obtain
$$\mathrm{Tr}\!\left[(X^{p}\,\sharp_\lambda\, Y^{p})^{1/p}\right] = \mathrm{Tr}\!\left[\left(X^{p/2}\left(X^{-p/2} Y^{p} X^{-p/2}\right)^{\lambda} X^{p/2}\right)^{1/p}\right] \geq \mathrm{Tr}\!\left[X\left(X^{-p/2} Y^{p} X^{-p/2}\right)^{\lambda/p}\right].$$


Thus we have
$$\begin{aligned}
D_\lambda(X|Y) &= \frac{\mathrm{Tr}\!\left[X - X^{1-\lambda} Y^{\lambda}\right]}{\lambda} \leq \frac{\mathrm{Tr}\!\left[X - X\left(X^{-p/2} Y^{p} X^{-p/2}\right)^{\lambda/p}\right]}{\lambda} \\
&= -\mathrm{Tr}\!\left[X\,\frac{\left(\left(X^{-p/2} Y^{p} X^{-p/2}\right)^{1/p}\right)^{\lambda} - I}{\lambda}\right] = -\mathrm{Tr}\!\left[X \ln_\lambda\!\left(X^{-p/2} Y^{p} X^{-p/2}\right)^{1/p}\right]. \qquad\square
\end{aligned}$$
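Inequality (3.2) can also be tested numerically; below is a minimal check (our illustration, reusing mpow, ln_l and D_l from the sketch in Section 1) on random positive matrices with $p \geq 1$:

```python
rng = np.random.default_rng(3)
lam, p, n = 0.7, 2.0, 3
A = rng.standard_normal((n, n)); X = A @ A.T + 0.1 * np.eye(n)   # positive X
B = rng.standard_normal((n, n)); Y = B @ B.T + 0.1 * np.eye(n)   # positive Y
M = mpow(mpow(X, -p / 2) @ mpow(Y, p) @ mpow(X, -p / 2), 1.0 / p)
assert D_l(X, Y, lam) <= -np.trace(X @ ln_l(M, lam)).real + 1e-10
```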

Remark 3.4. For positive matrices $X$ and $Y$, $0 < p < 1$ and $0 < \lambda \leq 1$, the following inequality does not hold in general:
$$(3.6)\qquad D_\lambda(X|Y) \leq -\mathrm{Tr}\!\left[X \ln_\lambda\!\left(X^{-p/2} Y^{p} X^{-p/2}\right)^{1/p}\right].$$
Indeed, inequality (3.6) is equivalent to
$$(3.7)\qquad \mathrm{Tr}\!\left[X\left(X^{-p/2} Y^{p} X^{-p/2}\right)^{\lambda/p}\right] \leq \mathrm{Tr}\!\left[X^{1-\lambda} Y^{\lambda}\right].$$
Then we have many counter-examples. If we set $p = 0.3$, $\lambda = 0.9$ and
$$X = \begin{pmatrix} 10 & 3 \\ 3 & 9 \end{pmatrix}, \qquad Y = \begin{pmatrix} 5 & 4 \\ 4 & 5 \end{pmatrix},$$
then inequality (3.7) fails. (The right-hand side minus the left-hand side of (3.7) is approximately $-0.00309808$.) Thus, inequality (3.6) is not true in general.
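The counter-example is easy to reproduce numerically (our sketch, reusing mpow from the sketch in Section 1); the printed difference should match the value reported above, approximately $-0.0031$:

```python
p, lam = 0.3, 0.9
X = np.array([[10.0, 3.0], [3.0, 9.0]])
Y = np.array([[5.0, 4.0], [4.0, 5.0]])
rhs = np.trace(mpow(X, 1 - lam) @ mpow(Y, lam)).real                    # R.H.S. of (3.7)
lhs = np.trace(X @ mpow(mpow(X, -p / 2) @ mpow(Y, p) @ mpow(X, -p / 2),
                        lam / p)).real                                  # L.H.S. of (3.7)
print(rhs - lhs)   # negative, so (3.7), and hence (3.6), fails for this X, Y
```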

Corollary 3.5.

(i) For positive matrices $X$ and $Y$, the trace inequality
$$D_\lambda(X|Y) \leq -\mathrm{Tr}\!\left[X \ln_\lambda\!\left(X^{-1/2} Y X^{-1/2}\right)\right]$$
holds.

(ii) For positive matrices $X$ and $Y$, and $p \geq 1$, we have inequality (3.1).

Proof.

(i) Put $p = 1$ in inequality (3.2) of Theorem 3.3.

(ii) Take the limit as $\lambda \to 0$ in (3.2) and replace $Y$ by $Y^{-1}$, noting that $X^{-p/2} Y^{-p} X^{-p/2} = \left(X^{p/2} Y^{p} X^{p/2}\right)^{-1}$. $\square$

REFERENCES

[1] S. ABE, Heat and entropy in nonextensive thermodynamics: transmutation from Tsallis theory to Rényi-entropy-based theory, Physica A, 300 (2001), 417–423.

[2] S. ABE, S. MARTÍNEZ, F. PENNINI AND A. PLASTINO, Nonextensive thermodynamic relations, Phys. Lett. A, 281 (2001), 126–130.

[3] H. ARAKI, On an inequality of Lieb and Thirring, Lett. Math. Phys., 19 (1990), 167–170.

[4] S. FURUICHI, Trace inequalities in nonextensive statistical mechanics, Linear Algebra Appl., 418 (2006), 821–827.

[5] S. FURUICHI, K. YANAGI AND K. KURIYAMA, Fundamental properties of Tsallis relative entropy, J. Math. Phys., 45 (2004), 4868–4877.

[6] S. FURUICHI, K. YANAGI AND K. KURIYAMA, A note on operator inequalities of Tsallis relative operator entropy, Linear Algebra Appl., 407 (2005), 19–31.

[7] S. GOLDEN, Lower bounds for the Helmholtz function, Phys. Rev., 137 (1965), B1127–B1128.

[8] F. HIAI AND D. PETZ, The Golden-Thompson trace inequality is complemented, Linear Algebra Appl., 181 (1993), 153–185.

[9] S. MARTINEZ, F. NICOLÁS, F. PENNINI AND A. PLASTINO, Tsallis' entropy maximization procedure revisited, Physica A, 286 (2000), 489–502.

[10] C. TSALLIS, Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys., 52 (1988), 479–487.

[11] C. TSALLIS, R.S. MENDES AND A.R. PLASTINO, The role of constraints within generalized nonextensive statistics, Physica A, 261 (1998), 534–554.

[12] W. THIRRING, Quantum Mechanics of Large Systems, Springer-Verlag, 1980.

[13] C.J. THOMPSON, Inequality with applications in statistical mechanics, J. Math. Phys., 6 (1965), 1812–1813.

[14] H. UMEGAKI, Conditional expectation in an operator algebra, IV (entropy and information), Kodai Math. Sem. Rep., 14 (1962), 59–85.

[15] H. UMEGAKI AND M. OHYA, Quantum Mechanical Entropy, Kyoritsu Pub., 1984 (in Japanese).

[16] J. von NEUMANN, Thermodynamik quantenmechanischer Gesamtheiten, Göttinger Nachrichten, (1927), 273–291.

[17] K. YANAGI, K. KURIYAMA AND S. FURUICHI, Generalized Shannon inequalities based on Tsallis relative operator entropy, Linear Algebra Appl., 394 (2005), 109–118.
