Exponential Operators
1. Definition and Elementary Properties
The analysis of many problems in mathematics and physics is facilitated by the use of functions of operators and matrices. One of the most important matrix functions is the exponential function, defined by the power series expansion

    e^A = \sum_{r=0}^{\infty} \frac{A^r}{r!},    (1.1)

where A^0 = I. It can be shown (1) that this series converges for all A.
The exponential function of a matrix possesses the following properties:

(1) If B is any matrix which commutes with A, then

    [e^A, B] = 0,  [e^A, e^B] = 0,  e^{A+B} = e^A e^B.    (1.2)

The first and second relations follow from the fact that, if [A, B] = 0, then [A^r, B^s] = 0 (r, s = 0, 1, 2, ...), so that every integral power of B commutes with every term in the series expansion for e^A. The last equation of (1.2) can be verified by comparing the series expansions for e^{A+B} and e^A e^B. It must be emphasized, however, that

    e^{A+B} \neq e^A e^B  if  [A, B] \neq 0.
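The failure of the factorization for noncommuting matrices is easy to exhibit numerically. In the following sketch, expm_series is an ad hoc helper that sums the defining series (1.1) directly (not a library routine), and the matrices A and B are arbitrary noncommuting examples:

```python
import numpy as np

def expm_series(M, terms=60):
    """Matrix exponential summed directly from the power series (1.1)."""
    E = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for r in range(1, terms):
        term = term @ M / r
        E = E + term
    return E

# A and B do not commute: AB - BA != 0.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0, 0.0],
              [1.0, 0.0]])

lhs = expm_series(A + B)               # exp(A + B)
rhs = expm_series(A) @ expm_series(B)  # exp(A) exp(B)
print(np.allclose(lhs, rhs))           # False: the factorization fails
```

By contrast, replacing B with any polynomial in A (which commutes with A) makes the two sides agree.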
(2) If t is a scalar variable, and A is a matrix independent of t, then

    \frac{d}{dt} e^{tA} = A e^{tA} = e^{tA} A.    (1.3)
¹ Cf. Appendix I, Section 4.
This relation follows from a term-by-term differentiation of the power series for e^{tA}:

    \frac{d}{dt} \sum_{r=0}^{\infty} \frac{t^r A^r}{r!} = \sum_{r=1}^{\infty} \frac{t^{r-1} A^r}{(r-1)!} = A \sum_{r=0}^{\infty} \frac{t^r A^r}{r!} = A e^{tA}.
On the other hand, if A is a function of the variable t, then

    \frac{d}{dt} e^{A(t)} \neq \frac{dA}{dt} e^{A(t)}  or  e^{A(t)} \frac{dA}{dt},  unless  \left[ A, \frac{dA}{dt} \right] = 0.
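Equation (1.3) can be checked numerically with a central difference; the matrix A, the evaluation point t, and the step h below are arbitrary illustrative choices:

```python
import numpy as np

def expm_series(M, terms=60):
    """Matrix exponential summed directly from the power series (1.1)."""
    E = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for r in range(1, terms):
        term = term @ M / r
        E = E + term
    return E

A = np.array([[0.0, 1.0],
              [-2.0, 0.3]])
t, h = 0.8, 1e-6

# Central difference approximates (d/dt) exp(tA); compare with A exp(tA).
deriv = (expm_series((t + h) * A) - expm_series((t - h) * A)) / (2 * h)
print(np.allclose(deriv, A @ expm_series(t * A), atol=1e-6))  # True
```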
(3) If S is a nonsingular matrix, (S^{-1}AS)^n = S^{-1}A^nS, so that

    S^{-1} e^A S = \sum_{r=0}^{\infty} \frac{1}{r!} (S^{-1}AS)^r = e^{S^{-1}AS}.    (1.4)
An important application of (1.4) results when S is interpreted as the matrix which reduces A to the triangular form of Schur's theorem¹:

    S^{-1}AS = T,  T_{ij} = 0 when j < i.
For if T is triangular, then T^n (n = 1, 2, ...) is also triangular, and this implies that

    S^{-1} e^A S = \exp(S^{-1}AS) = e^T

is triangular.
If the matrix A can be diagonalized, then e^A can also be diagonalized. Indeed, if S^{-1}AS = (A_i \delta_{ij}), then

    S^{-1} e^A S = (e^{A_i} \delta_{ij}),

which shows that the diagonal elements of S^{-1}e^A S are the exponential functions of the eigenvalues of A. This property also holds in the case where S^{-1}AS is triangular. For the diagonal elements of T^n are T_{11}^n, T_{22}^n, ..., so that the diagonal elements of e^T are e^{T_{11}}, e^{T_{22}}, .... But the latter are the eigenvalues of e^T, and their product is the determinant of e^T:

    \det e^T = e^{T_{11}} e^{T_{22}} \cdots = e^{T_{11} + T_{22} + \cdots} = e^{\operatorname{tr} T}.
Since the trace and determinant are invariant under a similarity transformation, it follows that

    \det e^A = e^{\operatorname{tr} A}.    (1.5)
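Relation (1.5) is easy to test on a random matrix; this is a sketch that again sums the series (1.1) directly, with an arbitrary seed and matrix size:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# exp(A) summed from the power series (1.1); the terms decay like ||A||^r / r!.
E = np.eye(4)
term = np.eye(4)
for r in range(1, 100):
    term = term @ A / r
    E = E + term

# det(exp(A)) equals exp(tr(A)), per (1.5).
print(np.isclose(np.linalg.det(E), np.exp(np.trace(A))))  # True
```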
Equation (1.5) shows that e^A is always nonsingular. In fact, the inverse of e^A is

    (e^A)^{-1} = e^{-A},    (1.6)

since

    e^A e^{-A} = e^{-A} e^A = e^{(A-A)} = e^0 = I.
If A = iH, where H is hermitian, then the adjoint of e^{iH} is

    (e^{iH})^{\dagger} = e^{-iH} = (e^{iH})^{-1}.

Thus e^{iH} is unitary if H is hermitian. A similar argument can be used to prove that e^A is orthogonal when A is skew-symmetric.
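The unitarity of e^{iH} can be illustrated numerically; the hermitian matrix H below is an arbitrary example, and the exponential is again summed from the series (1.1):

```python
import numpy as np

H = np.array([[1.0, 2.0 - 1.0j],
              [2.0 + 1.0j, -0.5]])  # hermitian: H equals its conjugate transpose

# exp(iH) summed from the power series (1.1).
M = 1j * H
E = np.eye(2, dtype=complex)
term = np.eye(2, dtype=complex)
for r in range(1, 80):
    term = term @ M / r
    E = E + term

# Unitarity: the adjoint of exp(iH) is its inverse.
print(np.allclose(E @ E.conj().T, np.eye(2)))  # True
```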
The preceding results also hold for exponential operators; the matrix forms for exponential operators appear upon introducing a basis and representing the operators by matrices.
The series expansion for the exponential function of a matrix A can be reduced to a polynomial in A if the integral powers of A obey simple recurrence relations. This is illustrated by the following examples:
1. Let

    A = i\theta \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix},

so that A^2 = -\theta^2 I, A^3 = -i\theta^3 \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, A^4 = \theta^4 I, .... Inserting these results into (1.1), one obtains

    e^A = I \cos\theta + i \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \sin\theta = \begin{pmatrix} \cos\theta & i\sin\theta \\ i\sin\theta & \cos\theta \end{pmatrix}.

2. Let A be the n \times n matrix

    A = \begin{pmatrix} 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & 1 \\ 0 & 0 & 0 & \cdots & 0 \end{pmatrix}.

It is easily verified that A^n = 0; hence

    e^A = I + A + \frac{A^2}{2!} + \cdots + \frac{A^{n-1}}{(n-1)!}.
In particular, when n = 3,
    A = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix},  A^2 = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix},  A^3 = 0,

    e^A = I + A + \tfrac{1}{2} A^2 = \begin{pmatrix} 1 & 1 & \tfrac{1}{2} \\ 0 & 1 & 1 \\ 0 & 0 & 1 \end{pmatrix}.
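Because A is nilpotent, the series terminates and the exponential is computed exactly in a finite number of terms; a minimal check of the n = 3 case:

```python
import numpy as np

A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])

# A is nilpotent (A^3 = 0), so the series (1.1) terminates exactly:
# exp(A) = I + A + A^2/2.
E = np.eye(3) + A + (A @ A) / 2

expected = np.array([[1.0, 1.0, 0.5],
                     [0.0, 1.0, 1.0],
                     [0.0, 0.0, 1.0]])
print(np.array_equal(E, expected))  # True
```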
3. A very interesting example is provided by the skew-symmetric matrix

    A = \begin{pmatrix} 0 & \cos\gamma & -\cos\beta \\ -\cos\gamma & 0 & \cos\alpha \\ \cos\beta & -\cos\alpha & 0 \end{pmatrix},

with

    \cos^2\alpha + \cos^2\beta + \cos^2\gamma = 1.

The square of A is

    A^2 = \begin{pmatrix} -\sin^2\alpha & \cos\alpha\cos\beta & \cos\alpha\cos\gamma \\ \cos\alpha\cos\beta & -\sin^2\beta & \cos\beta\cos\gamma \\ \cos\alpha\cos\gamma & \cos\beta\cos\gamma & -\sin^2\gamma \end{pmatrix},

but A^3 = -A, A^4 = -A^2, A^5 = A, ..., so that

    e^{\Phi A} = I + A \sin\Phi + (1 - \cos\Phi) A^2.

Upon inserting the matrices for A and A^2 into the right-hand member of the last equation, it is easily shown that

    e^{\Phi A} = R(\mathbf{n}\Phi),  \mathbf{n} = (\cos\alpha, \cos\beta, \cos\gamma),

where R(\mathbf{n}\Phi) is the three-dimensional rotation matrix derived in Appendix II.
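That e^{\Phi A} is a proper rotation about the axis \mathbf{n} can be verified directly; the direction cosines (1/3, 2/3, 2/3) and the angle below are arbitrary illustrative choices satisfying the constraint:

```python
import numpy as np

# Direction cosines of the axis; an arbitrary choice with
# cos^2(a) + cos^2(b) + cos^2(g) = 1.
ca, cb, cg = 1.0 / 3.0, 2.0 / 3.0, 2.0 / 3.0
A = np.array([[0.0,  cg, -cb],
              [-cg, 0.0,  ca],
              [ cb, -ca, 0.0]])

phi = 0.7
# exp(phi A) from the closed form I + A sin(phi) + (1 - cos(phi)) A^2.
R = np.eye(3) + A * np.sin(phi) + (1.0 - np.cos(phi)) * (A @ A)

n = np.array([ca, cb, cg])
# R is orthogonal, has determinant +1, and leaves the axis n fixed.
print(np.allclose(R @ R.T, np.eye(3)),
      np.isclose(np.linalg.det(R), 1.0),
      np.allclose(R @ n, n))  # True True True
```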
2. Expansion of e^{A+B}
The decomposition of e^{A+B} into a product of two or more simpler exponential functions is a problem of some difficulty (2) when [A, B] \neq 0. It is not difficult, however, to obtain a formal expansion for e^{A+B} in terms of "transformed powers" of B. This expansion is quite useful when B can be regarded as a small correction or perturbation on A.
The expansion is most readily obtained by introducing an auxiliary scalar parameter s, and defining Q(s) by

    Q(s) = e^{s(A+B)}.    (2.1)

Evidently

    Q(0) = I,  Q(1) = e^{A+B},  \frac{dQ}{ds} = (A + B) Q.    (2.2)
The differential equation for Q(s) can be written

    e^{sA} \frac{d}{ds} \left( e^{-sA} Q \right) = B Q,

or, upon multiplying from the left with e^{-sA}, as

    \frac{d}{ds} \left( e^{-sA} Q \right) = e^{-sA} B Q.    (2.3)
Integrating this equation from zero to s, one obtains

    Q(s) = e^{sA} \left\{ I + \int_0^s e^{-s'A} B Q(s') \, ds' \right\}.    (2.4)

Equation (2.4) can be used to obtain an expression for Q(s') by replacing s with s' and s' with s''. Inserting the expression for Q(s') into the integrand of (2.4) yields

    Q(s) = e^{sA} \left\{ I + \int_0^s e^{-s'A} B e^{s'A} \, ds' + \int_0^s e^{-s'A} B e^{s'A} \left[ \int_0^{s'} e^{-s''A} B Q(s'') \, ds'' \right] ds' \right\}.    (2.5)
Iterating this process, one finds that

    Q(s) = e^{sA} \left\{ I + \int_0^s e^{-s'A} B e^{s'A} \, ds' + \int_0^s e^{-s'A} B e^{s'A} \, ds' \int_0^{s'} e^{-s''A} B e^{s''A} \, ds''
        + \cdots + \int_0^s e^{-s'A} B e^{s'A} \, ds' \int_0^{s'} e^{-s''A} B e^{s''A} \, ds'' \cdots \int_0^{s^{(n-1)}} e^{-s^{(n)}A} B e^{s^{(n)}A} \, ds^{(n)} + \cdots \right\}.    (2.6)
The expansion for e^{A+B} follows upon putting s = 1.
Equation (2.6) expresses exp[s(A + B)] as a power series in B. If B = 0, it correctly reduces to e^{sA}. If A = 0, the multiple integrals are easily evaluated, and one obtains the series

    I + sB + \frac{s^2}{2!} B^2 + \cdots = e^{sB}.

Moreover, if [A, B] = 0, then all factors in B can be removed from the integrals, and Q(s) gives the correct factorization e^{sA} e^{sB}.
Except for the limiting cases just noted, the integrations indicated in (2.6) are not easily carried out. However, if one introduces the matrix representatives of A and B, the integrations become quite simple. It will be assumed that the chosen representation diagonalizes A, and that the eigenvalues of A are nondegenerate:

    A = (A_k \delta_{kj}),  e^{sA} = (e^{sA_k} \delta_{kj}),  B = (B_{kj}).
A particular term of (2.6) is said to be of the nth order if it involves n integrations. The order of a given term is equal to the number of times B occurs in that term. For example, the first-order term is
    e^{sA} \int_0^s e^{-s'A} B e^{s'A} \, ds',

and its matrix elements are given by

    \sum_l \sum_m \sum_n e^{sA_j} \delta_{jl} \int_0^s e^{-s'A_l} \delta_{lm} B_{mn} e^{s'A_n} \delta_{nk} \, ds'
        = e^{sA_j} \int_0^s e^{-s'A_j} B_{jk} e^{s'A_k} \, ds' = B_{jk} \frac{e^{sA_k} - e^{sA_j}}{A_k - A_j}.
Similar computations lead to the following expansion for the matrix elements of e^{s(A+B)}, up to terms of the third order:

    [e^{s(A+B)}]_{jk} = e^{sA_j} \delta_{jk} + B_{jk} \frac{e^{sA_k} - e^{sA_j}}{A_k - A_j}

        + \sum_l \frac{B_{jl} B_{lk}}{A_k - A_l} \left[ \frac{e^{sA_k} - e^{sA_j}}{A_k - A_j} - \frac{e^{sA_l} - e^{sA_j}}{A_l - A_j} \right]

        + \sum_q \sum_r \frac{B_{jq} B_{qr} B_{rk}}{A_k - A_r} \left[ \frac{e^{sA_k} - e^{sA_j}}{(A_k - A_q)(A_k - A_j)} - \frac{e^{sA_q} - e^{sA_j}}{(A_k - A_q)(A_q - A_j)} - \frac{e^{sA_r} - e^{sA_j}}{(A_r - A_q)(A_r - A_j)} + \frac{e^{sA_q} - e^{sA_j}}{(A_r - A_q)(A_q - A_j)} \right] + \cdots.    (2.7)
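The low-order terms of (2.7) can be checked numerically against the exact exponential when B is small; in this sketch the diagonal A (with nondegenerate eigenvalues) and the perturbation B are arbitrary illustrative choices, and only the zeroth- and first-order terms are kept:

```python
import numpy as np

# Diagonal A with nondegenerate eigenvalues, and a small perturbation B.
a = np.array([0.3, -0.5, 1.1])
A = np.diag(a)
B = 1e-4 * np.array([[0.0, 1.0, 2.0],
                     [1.0, 0.0, -1.0],
                     [2.0, -1.0, 0.0]])
s = 1.0

# Exact exp(s(A + B)) summed from the power series (1.1).
M = s * (A + B)
E = np.eye(3)
term = np.eye(3)
for r in range(1, 60):
    term = term @ M / r
    E = E + term

# Zeroth- plus first-order terms of (2.7); B is zero on the diagonal here,
# so the first-order correction is purely off-diagonal.
approx = np.diag(np.exp(s * a))
for j in range(3):
    for k in range(3):
        if j != k:
            approx[j, k] += B[j, k] * (np.exp(s * a[k]) - np.exp(s * a[j])) / (a[k] - a[j])

print(np.abs(E - approx).max() < 1e-6)  # True: remaining error is O(B^2)
```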
REFERENCES

1. P. R. Halmos, "Finite-Dimensional Vector Spaces," 2nd ed., Chap. IV. Van Nostrand, Princeton, New Jersey, 1958.
2. R. P. Feynman, Phys. Rev. 84, 108 (1951).