Markov Chains on Hypercubes: Spectral Representations and Several Majorization Relations
Samuel Karlin,* Department of Mathematics, Stanford University, Stanford, CA 94305
Bo Lindqvist,† Division of Mathematical Sciences, University of Trondheim, Norwegian Institute of Technology, N-7034 Trondheim, Norway
Yi-Ching Yao,‡ Department of Statistics, Colorado State University, Fort Collins, CO 80523
ABSTRACT
Various Markov chains on hypercubes $\mathcal{H}_n$ are considered and their spectral representations are presented in terms of Kronecker products. Special attention is given to random walks on the graphs $\mathcal{Y}_l$ ($l = 1, \ldots, n$) where the vertex set is $\mathcal{H}_n$ and two vertices are connected if and only if their Hamming distance is at most $l$. It is shown that $\lambda(\mathcal{Y}_1) \succ \lambda(\mathcal{Y}_l) \succ \lambda(\mathcal{Y}_{n-1}) \succ \lambda(\mathcal{Y}_n)$, $l = 2, \ldots, n-2$, where $\lambda(\mathcal{Y}_l)$ is the spectrum of the random walk on $\mathcal{Y}_l$, and $\succ$ denotes the majorization ordering. A similar majorization relation is established for the graphs $\mathcal{V}_l$, where two vertices are connected if and only if their Hamming distance is exactly $l$. Some applications to mean hitting times of these random walks are given. © 1993 John Wiley & Sons, Inc.
* Supported in part by NIH Grants GMHG00335-03, GM10452-28, AI08573, and NSF Grant DMS86-06244.
† Supported by The Norwegian Research Council for Science and Humanities.
‡ Work done while on leave at Stanford.
Random Structures and Algorithms, Vol. 4, No. 1 (1993) © 1993 John Wiley & Sons, Inc. CCC 1042-9832/93/010001-36
1. INTRODUCTION
We shall consider the $n$-dimensional hypercube
$$\mathcal{H}_n = \{\eta = (\eta_1, \ldots, \eta_n) : \eta_i = 0 \text{ or } 1\}$$
equipped with the group operation of componentwise addition modulo 2, i.e., $(\eta_1, \ldots, \eta_n) + (\eta'_1, \ldots, \eta'_n) = (\eta_1 + \eta'_1, \ldots, \eta_n + \eta'_n) \pmod 2$. A random walk on $\mathcal{H}_n$ is a Markov chain $\{X_t;\ t = 0, 1, \ldots\}$ with one-step transition probabilities given by
$$p_{\delta\delta'} = P(X_{t+1} = \delta' \mid X_t = \delta) = a_{\delta + \delta'} \qquad (1.1)$$
where $a_\eta \ge 0$ and $\sum_\eta a_\eta = 1$. We interpret $a_\eta$ as the probability of a change of state by $\eta$ over successive transitions. A classical example is the nearest neighbor random walk,
$$a_\eta = \begin{cases} 1/n & \text{for } |\eta| = 1 \\ 0 & \text{otherwise} \end{cases}$$
where $|\eta|$ denotes the number of 1's among the coordinates of $\eta$ (the Hamming weight). This random walk can be considered as the regular random walk on the graph $\mathcal{V}_1$ with vertex set $\mathcal{H}_n$ and edges connecting $\eta$ and $\eta'$ if and only if $|\eta + \eta'| = 1$. An extensive study of this random walk concerning the total variation distance between the transition probabilities and their limit is given in [5]. An asymptotic analysis of various random walks on groups (including the $\mathcal{V}_1$ random walk) is presented in Aldous [1]. More generally, let $\mathcal{V}_l$ ($l = 1, 2, \ldots, n$) denote the graph induced on $\mathcal{H}_n$ having edges connecting $\eta$ and $\eta'$ if and only if $|\eta + \eta'| = l$. The regular random walk on $\mathcal{V}_l$ corresponds to the probability array
$$a_\eta = \begin{cases} 1\big/\binom{n}{l} & \text{for } |\eta| = l \\ 0 & \text{otherwise.} \end{cases}$$
We shall also consider random walks $\mathcal{Y}_l$, $l = 1, 2, \ldots, n$, with edges connecting $\eta$ and $\eta'$ if and only if $1 \le |\eta + \eta'| \le l$. In this case
$$a_\eta = \begin{cases} 1\Big/\sum_{i=1}^{l}\binom{n}{i} & \text{for } 1 \le |\eta| \le l \\ 0 & \text{otherwise.} \end{cases}$$
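These probability arrays are easy to realize numerically. The following sketch (our own illustration, not from the paper; the helper names are hypothetical) builds the transition matrices of the $\mathcal{V}_l$ and $\mathcal{Y}_l$ walks directly from Hamming distances and checks that they are symmetric and stochastic:

```python
import itertools
import numpy as np

n = 4
verts = list(itertools.product([0, 1], repeat=n))  # vertex set {0,1}^n
N = 2 ** n

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def walk_matrix(connect):
    """Regular random walk: from each vertex, step to a uniformly chosen
    neighbour, where 'neighbour' means connect(Hamming distance) is True."""
    P = np.zeros((N, N))
    for i, u in enumerate(verts):
        nbrs = [j for j, v in enumerate(verts) if connect(hamming(u, v))]
        for j in nbrs:
            P[i, j] = 1.0 / len(nbrs)
    return P

l = 2
V = walk_matrix(lambda d: d == l)        # edges: Hamming distance exactly l
Y = walk_matrix(lambda d: 1 <= d <= l)   # edges: Hamming distance at most l

assert np.allclose(V, V.T) and np.allclose(Y, Y.T)   # symmetric, as (1.1) implies
assert np.allclose(V.sum(axis=1), 1) and np.allclose(Y.sum(axis=1), 1)
```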
Note that the random walks on $\mathcal{V}_l$ ($l \le n-1$) are periodic (period 2) for $l$ odd, and reducible (with two equivalence classes) for $l$ even ($\mathcal{V}_n$ has $2^{n-1}$ equivalence classes). In order to avoid periodicity, one might introduce positive probabilities of no movement. Thus in the nearest neighbor random walk, it is customary to let
$$a_\eta = \begin{cases} \dfrac{1}{n+1} & \text{for } |\eta| = 0 \text{ or } 1 \\ 0 & \text{otherwise.} \end{cases}$$
The random walk on $\mathcal{V}_1$ is also recognized as the (labeled) Ehrenfest model, where the state $\eta$ defines the positions of $n$ labeled balls with respect to two urns named I and II. More precisely, $\eta_i = 1$ (0) if the $i$th ball is in urn I (II). At each time $t$ a ball is drawn at random and moved to the other urn.
The Ehrenfest model is commonly coalesced to a Markov chain on the states $\{0, 1, \ldots, n\}$, representing the number of balls in urn I. If $X_t$ is the random walk on $\mathcal{V}_1$, then the process $Y_t = |X_t|$ can be construed as the standard Ehrenfest model. If $X_t$ is the random walk on $\mathcal{V}_l$, then $Y_t = |X_t|$ on $\{0, 1, \ldots, n\}$ is a modified Ehrenfest model where $l$ balls are sampled simultaneously and each is moved to the other urn. It is of interest to compare this Markov chain with the standard Ehrenfest chain, observed at times $l, 2l, 3l, \ldots$. The distinction between the modified Ehrenfest model and the standard chain observed at times $l$ apart corresponds to sampling $l$ balls without and with replacement.
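The coalescing (lumping) step can be checked directly. Here is a small sketch (our own construction, not from the paper), which lumps the nearest-neighbor walk by $|\eta|$ and recovers the classical Ehrenfest transition law:

```python
import itertools
import numpy as np

n = 5
verts = list(itertools.product([0, 1], repeat=n))
index = {v: i for i, v in enumerate(verts)}
N = 2 ** n

# nearest-neighbour walk on the hypercube: flip one coordinate, chosen uniformly
P = np.zeros((N, N))
for i, u in enumerate(verts):
    for k in range(n):
        w = list(u); w[k] ^= 1
        P[i, index[tuple(w)]] = 1.0 / n

# lump states by |eta| (number of ones); rows with equal |eta| aggregate
# identically, so Y_t = |X_t| is Markov -- the classical Ehrenfest chain
Q = np.zeros((n + 1, n + 1))
rep = {}
for i, u in enumerate(verts):
    rep.setdefault(sum(u), i)       # one representative state per level
for k, i in rep.items():
    for j, v in enumerate(verts):
        Q[k, sum(v)] += P[i, j]

# Ehrenfest transition law: k -> k-1 w.p. k/n, and k -> k+1 w.p. (n-k)/n
for k in range(n + 1):
    if k > 0:
        assert np.isclose(Q[k, k - 1], k / n)
    if k < n:
        assert np.isclose(Q[k, k + 1], (n - k) / n)
```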
Multiurn models have also been studied in the literature. Here the $n$ balls are distributed into $m$ urns, in which case the state is described by an $m$-dimensional vector giving the number of balls in urns $1, 2, \ldots, m$. At each stage a ball is selected at random and placed into urn $j$ with probability $p_j$ ($j = 1, \ldots, m$), $\sum_j p_j = 1$. Related models with applications in genetics are the Moran type models [7, 8, 13]. Again $n$ balls are distributed into $m$ urns. At each stage two balls are chosen at random with replacement. The first ball is returned to its urn together with a new ball (reflecting a birth). The second ball is removed from its urn (reflecting a death). There are also continuous time versions of these models; see, e.g., [6]. We shall not be concerned with these generalized models here. For the foregoing several Ehrenfest models, spectral representation formulas are available which feature explicit time-varying transition probabilities (see Section 2). Extensions of the results of this article to these cases would be of interest.
The principal objective of the present article is to compare the transition matrices of the random walks $\mathcal{V}_l$ and $\mathcal{Y}_l$ on $\mathcal{H}_n$. As is seen from (1.1), these matrices are all symmetric. We shall consider the ordering of symmetric matrices by majorization, characterized as follows. Let $A$ and $B$ be two symmetric (real) matrices of order $N$ with real eigenvalues $\lambda_1(A), \ldots, \lambda_N(A)$ and $\lambda_1(B), \ldots, \lambda_N(B)$, respectively. We say that $B$ majorizes $A$, written $B \succ A$, if
$$(\lambda_1(A), \ldots, \lambda_N(A)) \prec (\lambda_1(B), \ldots, \lambda_N(B))$$
where the symbol $\succ$ is the majorization ordering in $\mathbb{R}^N$ expressed by the inequalities
$$\sum_{i=1}^{k} \lambda_{(i)}(A) \le \sum_{i=1}^{k} \lambda_{(i)}(B), \quad k = 1, \ldots, N-1, \qquad \sum_{i=1}^{N} \lambda_{(i)}(A) = \sum_{i=1}^{N} \lambda_{(i)}(B)$$
where $\lambda_{(i)}(A)$ is the $i$th largest value among $\lambda_1(A), \ldots, \lambda_N(A)$. (If the equality condition is replaced by an inequality, then we say $B$ weakly majorizes $A$, denoted by $B \succ^{w} A$.) See the majorization treatise of Marshall and Olkin [12]. As shown in [10] (see also [3]), $B \succ A$ if and only if there exist orthogonal matrices
$U_1, \ldots, U_r$ and non-negative numbers $\alpha_1, \ldots, \alpha_r$, $\sum_{i=1}^{r} \alpha_i = 1$, such that
$$A = \sum_{i=1}^{r} \alpha_i\, U_i B U_i'$$
where $'$ denotes the transpose. The relation $B \succ A$ implies $\Phi(B) \ge \Phi(A)$ for convex functionals $\Phi$ defined on the set of symmetric matrices. An example of a convex functional is the average expected hitting time, defined by
$$\Psi(A) = \sum_{j} m_{ij}\, \pi_j$$
where $m_{ij}$ is the expected hitting time of state $j$, starting in the state $i$ with $A$ as the transition matrix (see Section 5), and $\{\pi_j\}$ is the stationary distribution. (By the random target lemma, this sum does not depend on the initial state $i$.)
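Section 5 develops this functional in detail. As a quick numerical sketch (our own, not the paper's development), one can compute $\Psi$ from the fundamental matrix $Z = (I - P + \mathbf{1}\pi)^{-1}$ of Kemeny and Snell, via $m_{ij} = (z_{jj} - z_{ij})/\pi_j$, and compare it with the eigenvalue expression $\sum_{k \ge 2} (1 - \lambda_k)^{-1}$:

```python
import itertools
import numpy as np

n = 4
verts = list(itertools.product([0, 1], repeat=n))
N = 2 ** n
d = lambda u, v: sum(a != b for a, b in zip(u, v))

# random walk on Y_2 (aperiodic, irreducible): step to a uniform vertex
# at Hamming distance 1 or 2
deg = sum(1 for v in verts if 1 <= d(verts[0], v) <= 2)
P = np.array([[1.0 / deg if 1 <= d(u, v) <= 2 else 0.0 for v in verts]
              for u in verts])

pi = np.full(N, 1.0 / N)  # uniform stationary distribution (P is symmetric)
Z = np.linalg.inv(np.eye(N) - P + np.outer(np.ones(N), pi))  # fundamental matrix
M = (np.diag(Z)[None, :] - Z) / pi[None, :]   # expected hitting times m_ij

# 'eigentime' identity: sum_j pi_j m_ij = sum_{k>=2} 1/(1 - lambda_k), any i
lams = np.sort(np.linalg.eigvalsh(P))[::-1]
target = np.sum(1.0 / (1.0 - lams[1:]))
assert np.allclose(M @ pi, target)
```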
Our main results concern the majorization comparison of the random walks on the graphs $\mathcal{V}_l$ and $\mathcal{Y}_l$ defined previously. Since the number of vertices connected to a given vertex of $\mathcal{Y}_l$ is increasing in $l$, one might surmise that $\mathcal{Y}_1 \succ \mathcal{Y}_2 \succ \cdots \succ \mathcal{Y}_n$, where we use the same notation for a graph and the associated transition matrix. However, $\mathcal{Y}_2 \not\succ \mathcal{Y}_3$ for large $n$. The following relations hold, and are the first main results of this article.
Theorem 1.1. (i) For odd $l$, $3 \le l < n$, we have $\mathcal{V}_1 \succ \mathcal{V}_l$; (ii) For odd $l$, $1 \le l < n$ with $l \ne n/2$, $\lambda_{(2)}(\mathcal{V}_l) = |1 - 2l/n|$, and for even $n$, $\lambda_{(2)}(\mathcal{V}_{n/2}) = \frac{1}{n-1}$, where $\lambda_{(2)}(\mathcal{V}_l)$ denotes the second largest eigenvalue of $\mathcal{V}_l$; (iii) For odd $l$ and $l'$, $1 \le l, l' < n$, $\lambda_{(2)}(\mathcal{V}_l) > \lambda_{(2)}(\mathcal{V}_{l'})$ if and only if $\binom{n}{l} < \binom{n}{l'}$.

Theorem 1.2. (i) For $2 \le l \le n$, we have $\mathcal{Y}_1 \succ \mathcal{Y}_l$; (ii) For $1 \le l \le n-2$, we have $\mathcal{Y}_l \succ \mathcal{Y}_{n-1}$; (iii) For $1 \le l \le n$,
$$\lambda_{(2)}(\mathcal{Y}_l) = \sum_{i=1}^{l}\binom{n}{i}\left(1 - \frac{2i}{n}\right)\Big/\sum_{i=1}^{l}\binom{n}{i},$$
which is decreasing in $l$.
Note that the graph $\mathcal{Y}_n$ is a clique: every pair of vertices is connected. The spectrum of $\mathcal{Y}_n$ consists of $1$ and $-\frac{1}{2^n - 1}$, with multiplicities $1$ and $2^n - 1$, respectively. Therefore $\mathcal{Y}_n$ is majorized by $\mathcal{Y}_l$ and $\mathcal{V}_l$ for all $l$. By Theorem 1.2, we have
$$\mathcal{Y}_1 \succ \mathcal{Y}_l \succ \mathcal{Y}_{n-1} \succ \mathcal{Y}_n, \qquad l = 2, 3, \ldots, n-2.$$
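This chain of majorizations can be checked numerically from the eigenvalue formulas of Section 2. The following sketch is our own (the helper names `spectrum_Y` and `majorizes` are hypothetical, and `K` implements the $p = \frac12$ Krawtchouk polynomial used below); it verifies the chain for $n = 7$:

```python
from math import comb

def K(l, x, n):
    """Krawtchouk polynomial K_l(x, n), p = 1/2, normalized so K_l(0, n) = 1."""
    return sum((-1)**j * comb(x, j) * comb(n - x, l - j)
               for j in range(l + 1)) / comb(n, l)

def spectrum_Y(l, n):
    """Spectrum of the walk on Y_l: one eigenvalue per x, multiplicity C(n,x)."""
    d = sum(comb(n, i) for i in range(1, l + 1))
    spec = []
    for x in range(n + 1):
        lam = sum(comb(n, i) * K(i, x, n) for i in range(1, l + 1)) / d
        spec += [lam] * comb(n, x)
    return sorted(spec, reverse=True)

def majorizes(b, a, tol=1e-9):
    """True if b majorizes a: equal sums, partial sums of b dominate."""
    assert len(a) == len(b) and abs(sum(a) - sum(b)) < tol
    sa = sb = 0.0
    for x, y in zip(a, b):
        sa += x; sb += y
        if sb < sa - tol:
            return False
    return True

n = 7
for l in range(2, n - 1):               # Y_1 > Y_l > Y_{n-1}, l = 2, ..., n-2
    assert majorizes(spectrum_Y(1, n), spectrum_Y(l, n))
    assert majorizes(spectrum_Y(l, n), spectrum_Y(n - 1, n))
assert majorizes(spectrum_Y(n - 1, n), spectrum_Y(n, n))   # Y_{n-1} > Y_n
```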
Consider the $N$-vertex cycle $\mathcal{C}_N$, where the $N$ vertices are arranged on a circle and each vertex is connected to its neighbors. Obviously, $\mathcal{C}_4$ is identical to $\mathcal{Y}_1$ with $n = 2$. For $n > 2$, we have

Theorem 1.3. For $n \ge 3$, $\mathcal{C}_{2^n} \succ \mathcal{Y}_1$.
These results are proved with the knowledge of the explicit eigenvalues and eigenvectors of the transition matrices through Kronecker product representations (Section 2), which for the models $\mathcal{V}_l$ and $\mathcal{Y}_l$ can be expressed in terms of the classical Krawtchouk polynomials. Section 3 summarizes the requisite properties of Krawtchouk polynomials, including new inequalities. Section 4 elaborates the proofs of Theorems 1.1–1.3. Section 5 shows that the average expected hitting time of a time-reversible irreducible Markov chain is monotone with respect to the spectrum. Some open problems are indicated in the final section of this article.
It should be remarked that random walks on graphs belong to the important class of time-reversible Markov chains, i.e., Markov chains for which there exist $\gamma_i > 0$, $1 \le i \le N$, with
$$\gamma_i p_{ij} = \gamma_j p_{ji} \quad \text{for all } i, j.$$
In this case the matrix $\Gamma = \operatorname{diag}(\gamma_i^{1/2})\, P\, \operatorname{diag}(\gamma_i^{-1/2})$ is symmetric, so there exists a set of $N$ real eigenvalues $\lambda_1(P) \ge \lambda_2(P) \ge \cdots \ge \lambda_N(P)$ for $P$, and the majorization ordering $P \succ Q$ (i.e., $\lambda(P) \succ \lambda(Q)$) is meaningful.
2. SPECTRAL REPRESENTATIONS FOR RANDOM WALKS ON $\mathcal{H}_n$
In this section we derive the eigenvalues and eigenvectors of the transition matrices $P$ corresponding to general random walks on $\mathcal{H}_n$. Our approach relies on a representation of $P$ in terms of Kronecker products. The Kronecker product of two matrices $A$ and $B$ in coordinate form is defined as
$$A_{n \times p} \otimes B_{m \times q} = \begin{pmatrix} a_{11}B & a_{12}B & \cdots & a_{1p}B \\ a_{21}B & a_{22}B & \cdots & a_{2p}B \\ \vdots & & & \vdots \\ a_{n1}B & a_{n2}B & \cdots & a_{np}B \end{pmatrix}_{nm \times pq}$$
and the special case of the Kronecker vector product becomes $x \otimes y = (x_1 y_1, x_1 y_2, \ldots, x_1 y_q, x_2 y_1, \ldots, x_2 y_q, \ldots, x_p y_1, \ldots, x_p y_q)$. It is familiar and easily checked that $(A_{n \times p} \otimes B_{m \times q})(x \otimes y) = Ax \otimes By$. It follows that the eigenvalues of $A \otimes B$ are given by all possible products $\lambda\mu$, where $\lambda$ and $\mu$ are eigenvalues of $A$ and $B$, respectively. A complete set of eigenvectors is given by the Kronecker products of the eigenvectors of $A$ and $B$.
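These Kronecker-product facts are easy to confirm numerically. A small sketch (our own illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# two random symmetric matrices
A = rng.standard_normal((3, 3)); A = A + A.T
B = rng.standard_normal((4, 4)); B = B + B.T

# eigenvalues of A (x) B are all products lambda_i * mu_j
lam = np.linalg.eigvalsh(A)
mu = np.linalg.eigvalsh(B)
products = np.sort(np.outer(lam, mu).ravel())
kron_eigs = np.sort(np.linalg.eigvalsh(np.kron(A, B)))
assert np.allclose(products, kron_eigs)

# the defining identity (A (x) B)(x (x) y) = (A x) (x) (B y)
x = rng.standard_normal(3); y = rng.standard_normal(4)
assert np.allclose(np.kron(A, B) @ np.kron(x, y), np.kron(A @ x, B @ y))
```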
Consider now a random walk on $\mathcal{H}_n$ generated by a probability distribution $(a_\eta : \eta \in \mathcal{H}_n)$. It is convenient to order the states $\eta$ in lexicographic order, i.e., $\eta < \eta'$ iff for some $r = 1, \ldots, n$ we have $\eta_i = \eta'_i$ for $i = 1, \ldots, r-1$ and $\eta_r < \eta'_r$. Set
$$J^{(0)} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \qquad J^{(1)} = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}.$$
It is easy to see that for all $\eta \in \mathcal{H}_n$, the matrix
$$J^{(\eta)} = J^{(\eta_1)} \otimes J^{(\eta_2)} \otimes \cdots \otimes J^{(\eta_n)}$$
carries a 1 in entry $(\alpha, \beta)$ if $\beta = \alpha + \eta$, and 0 otherwise. Here the indices $(\alpha, \beta)$ refer to the lexicographic order on $\mathcal{H}_n$. The transition matrix of the random walk defined by the $(a_\eta)$ has the representation
$$P = \sum_{\eta \in \mathcal{H}_n} a_\eta\, J^{(\eta)}. \qquad (2.1)$$
In order to compute the eigenvalues and eigenvectors of $P$ we first note that $J^{(\eta)} e^{(\delta)} = (-1)^{\eta\delta} e^{(\delta)}$ for all $\eta, \delta = 0$ or 1, where $e^{(0)} = (1, 1)$ and $e^{(1)} = (1, -1)$.
Let now $\delta = (\delta_1, \delta_2, \ldots, \delta_n)$, where $\delta_i = 0$ or 1, and define $e^{(\delta)} = e^{(\delta_1)} \otimes e^{(\delta_2)} \otimes \cdots \otimes e^{(\delta_n)}$. Then $e^{(\delta)}$ is a vector with entries labeled by $\eta \in \mathcal{H}_n$ in lexicographic order, and
$$(e^{(\delta)})_\eta = (-1)^{(\delta, \eta)} \qquad (2.2)$$
where $(\delta, \eta) = \sum_{i=1}^{n} \delta_i \eta_i$. Moreover,
$$J^{(\eta)} e^{(\delta)} = (J^{(\eta_1)} e^{(\delta_1)}) \otimes \cdots \otimes (J^{(\eta_n)} e^{(\delta_n)}) = (-1)^{(\eta, \delta)} e^{(\delta)}$$
showing that a complete set of orthogonal eigenvectors of $J^{(\eta)}$ consists of the vectors $e^{(\delta)}$, with corresponding eigenvalues $(-1)^{(\eta, \delta)}$. Since the eigenvectors $e^{(\delta)}$ of $J^{(\eta)}$ do not depend on $\eta$, it follows that these are the eigenvectors of $P$ in (2.1) as well. The eigenvalue of $P$ corresponding to $e^{(\delta)}$ is
$$\lambda_\delta = \sum_{\eta \in \mathcal{H}_n} a_\eta\, (-1)^{(\eta, \delta)}. \qquad (2.3)$$
The spectral representation for the powers $P^s$ ($s = 1, 2, \ldots$) is
$$P^s = \frac{1}{2^n} \sum_{\delta} \lambda_\delta^s\, e^{(\delta)} (e^{(\delta)})' \qquad (2.4)$$
where $\lambda_\delta$ is given by (2.3). Using (2.2), the entries of $P^s$ can be written
$$p^{(s)}_{\eta\eta'} = \frac{1}{2^n} \sum_{\delta} \lambda_\delta^s\, (-1)^{(\delta,\, \eta + \eta')}. \qquad (2.5)$$
Consider now the regular case in which $a_\eta$ depends on $\eta$ only through $|\eta|$. Writing $a_k$ for $a_\eta$ with $|\eta| = k$ ($k = 0, 1, \ldots, n$), note that $\sum_{k=0}^{n} \binom{n}{k} a_k = 1$. The following identity is germane. Let $\delta \in \mathcal{H}_n$ be such that $|\delta| = m$. Then
$$\sum_{|\eta| = k} (-1)^{(\eta, \delta)} = \sum_{j=0}^{k \wedge m} (-1)^j \binom{m}{j}\binom{n-m}{k-j} = \binom{n}{k} K_k(m, n) \qquad (2.6)$$
where $k \wedge m = \min\{k, m\}$ and $K_k(x, n)$ is the Krawtchouk polynomial [see Eq. (3.2)]. Using (2.6) we find from (2.3) that the eigenvalue of $P$ for $|\delta| = m$, with multiplicity $\binom{n}{m}$ ($m = 0, 1, \ldots, n$), is
$$\lambda_m = \sum_{k=0}^{n} \binom{n}{k}\, a_k\, K_k(m, n). \qquad (2.7)$$
The spectral representation (2.5) in the case when $a_\eta$ depends on $\eta$ through $|\eta|$ becomes [using (2.6)]
$$p^{(s)}_{\eta\eta'} = \frac{1}{2^n} \sum_{m=0}^{n} \binom{n}{m}\, \lambda_m^s\, K_m(|\eta + \eta'|, n).$$
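Formula (2.7) can be confirmed against a direct eigendecomposition. A sketch (our own, for the $\mathcal{V}_2$ walk on $n = 5$; `K` implements the $p = \frac12$ Krawtchouk polynomial of (3.2)):

```python
import itertools
from math import comb
import numpy as np

n = 5

def K(l, x, n):
    """Krawtchouk polynomial K_l(x, n), p = 1/2, normalized so K_l(0, n) = 1."""
    return sum((-1)**j * comb(x, j) * comb(n - x, l - j)
               for j in range(l + 1)) / comb(n, l)

# a regular walk: a_k depends only on |eta|; here the V_l walk with l = 2
l = 2
a = [0.0] * (n + 1)
a[l] = 1.0 / comb(n, l)

verts = list(itertools.product([0, 1], repeat=n))
P = np.array([[a[sum(u != v for u, v in zip(s, t))] for t in verts]
              for s in verts])

# (2.7): eigenvalue for |delta| = m is sum_k C(n,k) a_k K_k(m,n), mult. C(n,m)
spec = sorted(np.linalg.eigvalsh(P))
pred = sorted(lam for m in range(n + 1)
              for lam in [sum(comb(n, k) * a[k] * K(k, m, n)
                              for k in range(n + 1))] * comb(n, m))
assert np.allclose(spec, pred)
```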
Now let $X_t$ be a random walk on $\mathcal{H}_n$ with $a_\eta$ depending on $\eta$ through $|\eta|$. Consider the stochastic process $Y_t = |X_t|$ (see Section 1), with state space $\{0, 1, \ldots, n\}$. From Theorem 6.3.2 of Kemeny and Snell [11] it follows that $Y_t$ is a Markov chain. With $P$ given as before, the transition matrix $Q$ of $Y_t$ is given by
$$Q = UPV$$
where $U = (U_{k\eta})$ and $V = (V_{\eta k})$ are matrices of order $(n+1) \times 2^n$ and $2^n \times (n+1)$, respectively, given by
$$U_{k\eta} = \begin{cases} \binom{n}{k}^{-1} & \text{if } |\eta| = k \\ 0 & \text{otherwise} \end{cases} \qquad\qquad V_{\eta k} = \begin{cases} 1 & \text{if } |\eta| = k \\ 0 & \text{otherwise} \end{cases}$$
for all $\eta \in \mathcal{H}_n$, $k \in \{0, 1, \ldots, n\}$. For any $\delta$ with $|\delta| = m$ we have by (2.6),
$$\left(U e^{(\delta)}\right)_k = \binom{n}{k}^{-1} \sum_{|\eta| = k} (-1)^{(\eta, \delta)} = K_k(m, n) = K_m(k, n)$$
where we have used the symmetry $K_k(m, n) = K_m(k, n)$ [see (3.7)]. Therefore it follows from (2.4) that $Q^s$ has the spectral representation
$$q^{(s)}_{kj} = \frac{1}{2^n} \binom{n}{j} \sum_{m=0}^{n} \binom{n}{m}\, \lambda_m^s\, K_m(k, n)\, K_m(j, n).$$
In particular, it follows that $Q$ has eigenvalues $\lambda_0, \ldots, \lambda_n$ given by (2.7), each
with multiplicity 1. Moreover, the stationary distribution of $Y_t$ is the binomial distribution with parameters $\left(n, \frac12\right)$, for any values of the $a_k$.
Remark. We briefly indicate how the derivation of eigenvalues and eigenvectors for the random walk on $\mathcal{H}_n$ can be generalized to random walks on $\mathcal{H}_n^{(p)} = \{(\eta_1, \ldots, \eta_n) : \eta_i \in \{0, 1, \ldots, p-1\}\}$ for $p > 2$. To ease the exposition, let $p = 3$. Note first that the symmetric random walk on the state space $\{0, 1, 2\}$ has transition matrix
$$J^{(1)} = \begin{pmatrix} 0 & \tfrac12 & \tfrac12 \\ \tfrac12 & 0 & \tfrac12 \\ \tfrac12 & \tfrac12 & 0 \end{pmatrix}.$$
Let $J^{(0)}$ be the $3 \times 3$ identity matrix and define for any $\eta \in \mathcal{H}_n$ the $3^n \times 3^n$ matrix $J^{(\eta)} = J^{(\eta_1)} \otimes \cdots \otimes J^{(\eta_n)}$. Next, for $a_\eta \ge 0$, $\sum_{\eta \in \mathcal{H}_n} a_\eta = 1$, consider
$$P = \sum_{\eta \in \mathcal{H}_n} a_\eta\, J^{(\eta)}.$$
Then $P$ is a symmetric matrix which is the transition matrix of a random walk on $\mathcal{H}_n^{(3)}$ with the following properties: starting at any state $\gamma = (\gamma_1, \ldots, \gamma_n) \in \mathcal{H}_n^{(3)}$, $a_\eta$ is the probability of changing exactly the components $i$ corresponding to $\eta_i = 1$. Having chosen the components $i$ to be changed, the two possible changes of each component state have equal probability.
We now consider the derivation of eigenvalues and eigenvectors of $P$. The eigenvalues of $J^{(1)}$ are $1, -\frac12, -\frac12$. Let $e^{(0)}, e^{(1)}, e^{(2)}$ be orthogonal eigenvectors of $J^{(1)}$, with
$$J^{(\eta)} e^{(\gamma)} = \lambda_{\eta\gamma}\, e^{(\gamma)}, \qquad \lambda_{\eta\gamma} = \begin{cases} 1 & \text{if } \eta = 0 \text{ or } \gamma = 0 \\ -\tfrac12 & \text{otherwise} \end{cases}$$
for $\eta = 0, 1$ and $\gamma = 0, 1, 2$. Defining, for $\gamma = (\gamma_1, \ldots, \gamma_n) \in \mathcal{H}_n^{(3)}$, $e^{(\gamma)} = e^{(\gamma_1)} \otimes \cdots \otimes e^{(\gamma_n)}$, it follows that
$$J^{(\eta)} e^{(\gamma)} = \left(-\tfrac12\right)^{\#\{i\,:\ \eta_i = 1,\ \gamma_i \ge 1\}} e^{(\gamma)}.$$
Thus the $3^n$ vectors $e^{(\gamma)}$ form an orthogonal set of eigenvectors of $J^{(\eta)}$ and hence of $P$, with eigenvalue of $P$ corresponding to $e^{(\gamma)}$ given by
$$\lambda_\gamma = \sum_{\eta \in \mathcal{H}_n} a_\eta \left(-\tfrac12\right)^{\#\{i\,:\ \eta_i = 1,\ \gamma_i \ge 1\}}. \qquad (2.8)$$
As an example, let $a_\eta = \frac1n$ if $|\eta| = 1$ and $a_\eta = 0$ otherwise. Then (2.8) implies that the eigenvalue corresponding to any $\gamma$ with $\sum_{i=1}^{n} (\gamma_i \wedge 1) = m$ is $\lambda_m = 1 - \frac{3m}{2n}$ ($m = 0, 1, \ldots, n$), and $\lambda_m$ has multiplicity $\binom{n}{m} 2^m$. This example corresponds to an Ehrenfest model with three urns, where at each stage a ball is chosen at random and placed with equal probability in one of the two other urns. (The interpretation of $\gamma_i$ is thus the labelling (0, 1, or 2) of the urn containing the $i$th ball.)
3. PROPERTIES OF THE KRAWTCHOUK POLYNOMIALS

In this section, we first review basic properties of the Krawtchouk polynomials and then establish new inequalities. These more technical results are needed in Section 4 to prove Theorems 1.1–1.3. The reader can skip directly to Section 4 and refer back where necessary. The Krawtchouk polynomials are defined by either of the equivalent formulas
$$K_l(x, p, n) = \sum_{j=0}^{l} (-1)^j \binom{l}{j}\binom{x}{j}\binom{n}{j}^{-1} p^{-j} \qquad (3.1)$$
or
$$K_l(x, p, n) = \binom{n}{l}^{-1} \sum_{j=0}^{l} (-1)^j \left(\frac{q}{p}\right)^{j} \binom{x}{j}\binom{n-x}{l-j} \qquad (3.2)$$
for $l = 0, 1, \ldots, n$ and $0 < p = 1 - q < 1$ [15]. The generating function
$$\sum_{l=0}^{n} \binom{n}{l} K_l(x, p, n)\, s^l = \left(1 - \frac{q}{p}\,s\right)^{x} (1 + s)^{n-x} \qquad (3.3)$$
is an easy consequence of (3.2). In Section 2, we wrote $K_l(x, n)$ for $K_l(x, \frac12, n)$. The Krawtchouk polynomials are orthogonal with respect to the binomial distribution with masses $\binom{n}{x} p^x q^{n-x}$ at the points $x = 0, 1, \ldots, n$:
$$\sum_{x=0}^{n} \binom{n}{x} p^x q^{n-x}\, K_l(x, p, n)\, K_{l'}(x, p, n) = \frac{\delta_{ll'}}{\pi_l} \qquad (3.4)$$
where $\pi_l = \binom{n}{l}\left(\frac{p}{q}\right)^{l}$. Observe the special values
$$K_0(x, p, n) = K_l(0, p, n) = 1, \qquad K_l(n, p, n) = \left(-\frac{q}{p}\right)^{l}. \qquad (3.5)$$
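The orthogonality relation (3.4) is easy to verify numerically. A sketch (our code, with `K` implementing definition (3.2) for general $p$):

```python
from math import comb

def K(l, x, p, n):
    """Krawtchouk polynomial, Karlin-McGregor normalization K_l(0) = 1."""
    q = 1.0 - p
    return sum((-1)**j * (q / p)**j * comb(x, j) * comb(n - x, l - j)
               for j in range(l + 1)) / comb(n, l)

n, p = 6, 0.3
q = 1.0 - p
for l in range(n + 1):
    for m in range(n + 1):
        s = sum(comb(n, x) * p**x * q**(n - x) * K(l, x, p, n) * K(m, x, p, n)
                for x in range(n + 1))
        # (3.4): zero off the diagonal, 1/pi_l on it, pi_l = C(n,l)(p/q)^l
        target = 0.0 if l != m else 1.0 / (comb(n, l) * (p / q)**l)
        assert abs(s - target) < 1e-9
```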
The recurrence relation
$$p(n-l)\, K_{l+1}(x, p, n) = [\,p(n-l) + ql - x\,]\, K_l(x, p, n) - ql\, K_{l-1}(x, p, n) \qquad (3.6)$$
is valid for all complex $x$ if $l = 0, 1, \ldots, n-1$, but only for $x = 0, 1, \ldots, n$ if $l = n$. From (3.1) we also have the symmetry relation
$$K_l(x, p, n) = K_x(l, p, n), \qquad l, x = 0, 1, \ldots, n. \qquad (3.7)$$
It follows from (3.1) and (3.2) that
$$K_l\left(n - x, \tfrac12, n\right) = (-1)^l\, K_l\left(x, \tfrac12, n\right) \qquad (3.8)$$
which implies, together with (3.6), that for even $n$, $K_l\left(\frac n2, \frac12, n\right) = 0$ for odd $l$, and
$$K_l\left(\frac n2, \tfrac12, n\right) = (-1)^{l/2}\, \frac{1 \cdot 3 \cdots (l-1)}{(n-1)(n-3)\cdots(n-l+1)} \quad \text{for even } l. \qquad (3.9)$$
The following relations connect polynomials with different $n$:
$$l\, K_{l-1}(x, p, n-1) + (n-l)\, K_l(x, p, n-1) = n\, K_l(x, p, n), \qquad l = 0, 1, \ldots, n, \text{ all complex } x, \qquad (3.10)$$
$$x\, K_l(x-1, p, n-1) + (n-x)\, K_l(x, p, n-1) = n\, K_l(x, p, n), \qquad x = 0, 1, \ldots, n \ (\text{all } x \text{ if } l < n). \qquad (3.11)$$
Using the generating function formula (3.3) and the recurrence (3.6), one can show that the polynomials
$$W_l(x, p, n) = \sum_{i=0}^{l} \pi_i\, K_i(x, p, n), \qquad l = 0, 1, \ldots, n-1 \qquad (3.12)$$
satisfy the recurrence relation
$$q(l+1)\, W_{l+1}(x, p, n) = [\,p(n-l-1) + ql - x + 1\,]\, W_l(x, p, n) - p(n-l)\, W_{l-1}(x, p, n). \qquad (3.13)$$
They can be expressed in the form
$$W_l(x, p, n) = \binom{n-1}{l}\left(\frac{p}{q}\right)^{l} K_l(x-1, p, n-1). \qquad (3.14)$$
The above properties can be found either in [15] or [6]. For the remainder of this section, we use these properties to derive new facts on the Krawtchouk polynomials with $p = \frac12$ which are of some interest and also needed later. For simplicity, we write $K_l(x, n)$ for $K_l(x, \frac12, n)$. Also, $[x]$ denotes the greatest integer not exceeding $x$.

Remark. The easy fact that $K_1(x, n) = 1 - 2x/n$ will be used implicitly at many places.
Proposition 3.1. For all $n \ge 2$,
$$|K_l(x, n)| \le K_1(x, n) \quad \text{for all non-negative integers } x \le \left[\frac{n-1}{2}\right],\ l = 1, 2, \ldots, n-1. \qquad (3.15)$$
For $n$ even,
$$K_l\!\left(\frac n2, n\right) = 0 \quad \text{if } l \text{ is odd},\ 1 \le l \le n-1, \qquad (3.16)$$
and
$$\left|K_l\!\left(\frac n2, n\right)\right| \le \frac{1}{n-1} \quad \text{if } l \text{ is even},\ 2 \le l \le n-2. \qquad (3.17)$$
Proof. Direct for $l = 1$ and for $l = n-1$, since by (3.7), (3.8), $|K_{n-1}(x, n)| = |K_1(x, n)|$ for all $x = 0, 1, \ldots, n$. First assume that (3.16) and (3.17) hold. We prove (3.15) by induction on $n$. Note that (3.15) holds for $n = 2$. Suppose it holds for $n - 1$ ($n \ge 3$). Then by (3.10) we have
$$K_l(x, n) = \frac ln\, K_{l-1}(x, n-1) + \frac{n-l}{n}\, K_l(x, n-1) \qquad (3.18)$$
for all $x$ and $l = 0, 1, \ldots, n$. By the induction assumption, for $2 \le l \le n-2$ and $x \le \left[\frac{n-2}{2}\right]$, we obtain
$$|K_l(x, n)| \le \frac ln\, |K_{l-1}(x, n-1)| + \frac{n-l}{n}\, |K_l(x, n-1)| \le \frac ln\, K_1(x, n-1) + \frac{n-l}{n}\, K_1(x, n-1) = K_1(x, n-1) \le K_1(x, n).$$
Now $\left[\frac{n-2}{2}\right] = \left[\frac{n-1}{2}\right]$ if $n$ is even, so in that case we have proved (3.15). If $n$ is odd, then we need to check (3.15) for $x = \frac{n-1}{2}$ (and $2 \le l \le n-2$). We apply (3.16) and (3.17) to (3.18). Note that whether $l$ is even or odd, only one term will remain on the right-hand side of (3.18), yielding
$$\left|K_l\!\left(\frac{n-1}{2}, n\right)\right| \le \frac{n-2}{n}\cdot\frac{1}{n-2} = \frac1n = K_1\!\left(\frac{n-1}{2}, n\right),$$
establishing (3.15) for $n$ odd and $x = \frac{n-1}{2}$.
It remains to prove (3.16) and (3.17). Here (3.16) follows from (3.8). From (3.9), for $l$ even and $n$ even, we have
$$K_l\!\left(\frac n2, n\right) = (-1)^{l/2}\,\frac{1\cdot3\cdots(l-1)}{(n-1)(n-3)\cdots(n-l+1)}$$
and it is straightforward to check that
$$\left|K_l\!\left(\frac n2, n\right)\right| \le \frac{1}{n-1} \qquad \text{for } l = 2, 4, \ldots, n-2,$$
completing the proof.
Corollary. (i) For r 5 n - 1 and 0 5 x 5 [ 51 - 1,
(3.19)
n - 1 r + l 2 2 (ii) For x = - (assuming n is odd), (3.19) holds ifr is even or if- is an odd
r + l . 2 integer. If - IS an even integer, then
I= 1 K l ( q , n ) < O .
Proof. (3.12) and (3.14) give, for l = 0, 1, . . . , n,
n + l 2 K, (x , n) = - [ K , + , ( x + 1, n + 1) - K,(x + 1, n + l)]
Therefore, for 1 5 r 5 n,
MARKOV CHAINS ON HYPERCUBES 13
By (3.15), this is nonnegative provided r + 1 5 n and x + 1 I . This proves
I f x = - , then by (3.16), the right-hand side of (3.20) equals 0 when r + 1
is odd. If r is odd, then K,+,(x + 1, n + 1 ) = K,,, (q, n + 1 # 0 is given by r + l (3.9), which is negative if and only if 7 is an odd integer. This proves (ii).
[ ; I )
n - 1 2
(9.
,5
The following proposition is stronger than Proposition 3.1, but with a more restrictive range of x and 1.
Proposition 3.2. For n 2 8 , 0 I x I [ - 2 ' 1 ( x integer), and 3 I 1 I n - 2
n - x + l n + l K,(x, n ) . I K / k n + 111 5
We need some ancillary lemmas.
Lemma 3.3. For x , 1 = 0, 1 , . . . , n ,
X n + l - x K,(x, n + 1 ) = - Kl(x - 1, n ) + K/(X, n ) . n + l n + l
Proof. By (3.10),
1 n + l - 1 KLx, n ) . n + l Kl(x, n + 1 ) = - KI- 1 ( x , n) + n + l
By (3.7), the left-hand side of the above identity equals Kx(l, n + l ) , and the 1 n + l - 1
right-hand side equals - K,(l- 1 , n ) + Kx(l, n). Interchanging x with 1 yields the lemma. n + l n + l
Lemma 3.4. For n odd and 1 even (1 I n - l ) ,
Kl( 9, n ) = Kl( T, n + 1 ) .
n + l Proof. By Lemma 3.3 with x = - 2 '
, n + 1 ) = j 1 K/( 9, n) + 5 1 K/( F, n )
= - 1 K / ( ~ , n ) f 5 ( - 1 ) ' K 1 ( ~ , n ) , n - 1 1 n - 1 by(3.8) 2
n - 1
Lemma 3.5. For n odd and 1 even (1 5 n + l ) ,
14 KARLIN, LINDQVIST, A N D YAO
Proof. By Lemma 3.3,
n + l K,( y, n + 1)
K , ( T , n + 1 ) n + 2 - ( n + 1 ) / 2
n + 2 +
- - n + l n + 3 K , ( F , n + l ) . 2(n + 2)
+
K,( n+l, n + 1 ) 2(n + 2 )
K , ( T , n + 2 ) - n + l 2 - - 2 ( n + 2 ) K,(?,n+3)-= n + 3 K , ( = , n + l ) ,
n + l 2 by Lemma 3.4
- - n + l n + 2 n + 3 1
- - n + l ( 1 - f ) ( 1 - * * - (1 - ;) , by (3.9)
1 n + 1 - 2 1 = ( 1 - f ) ( l - i). . . ( 1 - --)
+
Lemma 3.6. For n odd and 1 odd (1 I n) ,
KI( y , n + 1 ) = ( 1 - ? ) ( I - y). . . ( 1 - y) n+l 21 .
Proof. By (3.10),
n - 1 K, - ( 2 , n ) +
+ -
2 n + l K, ( , n ) 1
-- - I (F,n+l)+ n-+iyl K , ( F , n ) , n + 1 K / - l
by Lemma 3.4 - 1 n - 1 --
n + l
n - 1 - KI- l ( 2, n - 1) ,
n + l - 1 1
+ - K , ( ~ , n - l n - 1 n - 1 n
=- 1 K , _ , ( ~ , n + l ) + n + l n + l n + l
since K,( y, n - 1 = 0 for I odd(by (3.16)) 1
+ n + l - 1 . - ( 1 1 n + l n - $2)
MARKOV CHAINS ON HYPERCUBES 15
x ( 1 - - ' i1)--*(l-5) n - 2 ' by(3.9)
= ( 1 - T)( 1 - 3) * ' . ( 1 - --) n+l . 1 - 1 1 - 1 1 - 1 21
Proof of Proposition 3.2. The proposition is verified direct1 for n =8. Now
suppose it holds for n - l (n 2 9) . Then for n, and x I [ni21, - 3 5 1 S n - 3 ,
X n + l - x I K , ( ~ , n + 111 = I - n + l K,(x - 1, n ) + n + l K,(x, n)l , by Lemma 3.3
x n - x + l
n + l - x n - x n + l n
5-. K,(x - 1 , n - 1 )
K,(x, n - 1 ) ,
n + l n
+ .-
by induction assumption
n - x + l n + l - - K,(x, n ) , by Lemma 3.3.
If n is even, then [ q] = [ F]. It remains to consider the two cases: (i) n - 1
2 l = n - 2 and (ii) n odd, x = - , 3 1 1 ~ n - 2 .
For 1 = n - 2, by (3.7) and (3.8),
Kn-2(x, n + 1) = K,(n - 2, n + 1 ) = (-1)"K,(3, n + 1 ) = (-1)'K3(x, n + 1 ) .
So, the case I = n - 2 is equivalent to the case 1 = 3, which is already proved.
For case (ii) n odd, x = - , 3 I I I n - 2, it follows from Lemmas 3.5 and 3.6 that
n - 1 2
(-l)"21.3* - - ( I - 1 ) II + 1 - 21 , leven / n - 1 n ( n - 2 ) - . . ( n - l + 2 ) n + l
( 1 - 2 ) . 21 1 odd. (-1)"-1)'21.3.. . n ( n - 2 ) - . . ( n - l + 3 ) n + l '
(3.21)
K , ( T , n + 1) =
We need to show for odd n 2 9,
16 KARLIN, LINDQVIST, AND YAO
We first prove the odd 1 case by induction on n (n odd). It is easily seen that (3.22) holds for n = 9. Suppose (3.22) holds for (odd) n and 3 I (odd) 15 n - 2, (n 2 9) . Consider the case n + 2. First fix (odd) 1 with 3 I 1s n - 2. Then by (3.211,
n - 1 + 3 n + l n + 2 n + 3
.- n + l
n + 3 n - 1 + 3 n + l n - 1 + 3 - - .- 5 2n(n+1) n + 2 n + 3 2 n ( n + 2 ) -
It suffices to show
n - 1 + 3 n + 5 2n(n + 2) 2(n + 2)(n + 3 ) ’
or equivalently,
-1+3 2 5-
n n + 3 ’
which certainly holds for 1 1 3.
and (3.8) Next consider 1 = n (odd). But this case is equivalent to 1 = 3, since by (3.7)
So, we have proved (3.22) for 1 odd. Now consider the case of even 1. Since by (3.7) and (3.8)
n + l it suffices to consider 1 5 ~ . But I K/( 9, n + 1 ) I is decreasing in (even) 1 5 - , since by (3.21),
n + l 2 2
n + l 2 f o r l s - - 2 5 1 ,
Thus, it suffices to prove (3.22) with 1 = 4. By (3.21),
1.3 n - 7 , n + l = .- K4( y ) n ( n - 2 ) n + l ’
n + 3 2n(n + 1 ) .
which is bounded by This completes the proof.
The following two lemmas are needed in Section 4.
MARKOV CHAINS ON HYPERCUBES 17
Lemma 3.7. For odd n 2 9, 3 I 11 n - 2,
Proof. For odd 1, Kl( T, n + 1 ) = 0 by (3.8). By Proposition 3.2, (3.23) holds. Now consider even 1 2 4. We have, by Lemma 3.5,
, n + 1 ) and Kl( F, n + 1 n + l
For - 2 signs, so that (3.24) is bounded by
n + l 2 by Propositions 3.1 and 3.2. For 4 1 1 < - , (3.24) is decreasing in 1. Thus, it
suffices to check 1 = 4. For 1 = 4, (3.24) equals
3(3n + 5 - 16) n(n - 2)(n + 3 ) ’
1 which is I- completing the proof. n’
Lemma 3.8. For even n 2 8 , 3 5 1s n - 2,
n + 4 l K l ( : - l , n + l Kl( i, n + 1)I I n(n + 1 ) (3.25)
1 I KI( :, n + 1)11 - n + l (3.26)
Proof. By Lemma 3.3,
X n + 2 - x Kl(x, n + 2) = - Kl(x - 1 , n + 1 ) + Kl(x, n + 1 ) . n + 2 n + 2
Setting x = nl2,
n + 4 K l ( : , n + l ) .
18 KARLIN, LINDQVIST, A N D YAO
So the left-hand side of (3.25) equals
n (n + 1)- - + 1
2(n + 2) 2 n n (n + 1) + 1
x K1( 5, n + 1) , by Proposition 3.2
n + 4 n(n + 1) ’
- -
proving (3.25). To prove (3.26), note that for even I ,
Thus, (3.26) is equivalent to
1 <- (I - 1) 1 . 3 ... (n + l)(n - 1). - . ( n - I + 3) - n + 1 ’
which is obviously true. So (3.26) holds for even I, 4 c 1 I n - 2. For odd 1, 3 5 1s n - 3, by (3.7) and (3.8),
Since n + 1 - I is even between 4 and n - 2, we have
completing the proof.
4. PROOFS OF MAJORIZATION RESULTS
In this section, we prove the majorization results stated in Section 1 by using the exact expressions for eigenvalues derived in Section 2 along with the inequalities on Krawtchouk polynomials of Section 3.
Lemma 4.1. Let u l , u; , . . . , u k , u; be vectors such that ui > ui, i . e . , ui majorizes u’ I ) i = 1 , . . . , k . Then ( u ~ , . . . , u k ) > (ui, . . . , u;).
This lemma is immediate using the majorization relation expressed via doubly stochastic matrices (see [12, pp. 21-22]).
MARKOV CHAINS ON HYPERCUBES 19
Proof of Theorem 1.1. Note that by (2.7), the spectrum of Vl consists of eigenvalues K,(x, n) with respective multiplicities (:), x = 0, 1 , . . . , n.
(i) We partition X(V,), the spectrum of V,, into subvectors h'"'(Vl), x = 0,
1 , . . . , [ i] where A")(T), O S X < consists of (:) copies of the values Kl(x, n) and K,(n - x , n), and for even n, h'""'(V,) consists of ( ny2) copies of
Proposition 3.1,
2 '
= 0. By (3.8), K,(x, n) = -K, (n - x , n) when I is odd. By (3.15) of
h'"'(Vl) >X'"'(V,), x = 0,1, . . . , [ 51.31 (odd) I < n .
Now, part (i) follows from Lemma 4.1. (ii) Note that (3.15) is equivalent to
F o r o d d l r n - [", 'I, - x = 1,2, . . . , n - 1
IKAx, n)I = IKn-/(x, n)l, by (3.8)
5 Kfl-/(19 n), by (4.1)
= -K,( l , n), by (3.7) and (3.8)
= K,(n - 1, n), by (3.8) and 1 odd.
So, A(21(V,) = K,(n - 1, n) = lK,(l, n)l = l K l ( l , n)l = 11 -2Unl for n - [",'I5 - n 2
By (3.7), (3.16), and (3.17), IKnlz(x, n ) l 5 -
(odd) 1 < n. It remains to consider the case (odd) 1 = - when n is even.
~ = l , . . . , II - 1. But
K,,,(n - 2, n) = -Kn/2(2, n) = - by (3.9). Also, KnI2(n, n)= -1. SO,
A V ) = -
1 n - 1 ' 1
n - 1 1 n - 1. :. (2)( n / 2
Finally part (111) follows immediately from (ii), completing the proof of Theorem 1.1.
To prove Theorem 1.2, we introduce the random walk W, on %; where two vertices are connected if they differ in either I - 1 or 1 components. Note that W2 = Y2. Clearly, W, has eigenvalues [see (2.7)]
20 KARLIN, LINDQVIST, AND YAO
with multiplicities (:), x = 0, 1 , . . . , n. For even I , K,(x, n + 1 ) = K,(n + 1 - x , n + l ) , so x and n + 1 - x correspond to the same eigenvalue K,(x, n + 1 ) with the
total multiplicity (:) + ( . But if n + 1 is
even, the eigenvalue K,(
Proposition 4.2. For even I , 2 1 5 n - 2, we have 9, > W,.
To prove the proposition, we introduce a notion of dominance and provide a lemma.
Definition. Let A = (a l , . . . , a,) and B = ( b I , . . . , b m ) be two vectors of real numbers. We say that A is dominated by B if
(i) the ai’s and bi’s are all of the same sign, (ii) for each k = 1 , . . . , n
i= 1 i = 1
where the a7.s and br’s are the descending arrangement (in magnitude) of the ai’s and bj’s, respectively, i.e.,
Lemma 4.3. Let A = ( a , , . . . , a, ) and B = (b , , . . . , b,) be two vectors of real
numbers with c ai = b j . Suppose that there exist a partition A , , . . . , A , of A
( i e . , the subheitors A: are dkjoint and their union equals A ) and dhjoint subvectors B,, . . . , B, of B such that A , is dominated by B,, (Y = 1 , . . . , r . Then A < B.
n n
Proof. Without loss of generality, assume that a, 2 a, 2 * * - 2 a,, b , 2 b, 2 - - * 2 b,. For each k, we need to show
k k
i=l i = l
(i) Case 1: bk 2 0. Let k’ = max{ i 5 k : ai > O}. It suffices to show
k ’ k’
a i s c b i . i= I i = 1
MARKOV CHAINS ON HYPERCUBES 21
Let G, = A, f l { a l , . . . , a k , } . Since A, is dominated by B,, there exists (possibly empty) H, C B, such that ( H , I 5 I G, I and the sum of the numbers in G, 5 sum of thenumbersinH,. Since IH,UH,U...UH,I.IG,UG,U...UG,(=k', we have
t. t' 2 ai 5 sum of the numbers in HI U - - - U H, 5 2 bi . i = 1 i = 1
(ii) Case 2: b , < 0. It is enough to prove that n 2 a i r x b, with b,,, <o,
i = k + l i = k + l
n n n
since x ai = z b i . Let k = min{i 2 k + 1 : a, < O}. It suffices to show z a, 2 z b i . %ing an argument analogous to that used in Case 1, we accomplish the proof.
i = 1 k"
k"
Proof of Proposition 4.2. We partition the vector A(W/) into subvectors A,,
0 1 x I [ 71, where A,, 0 I X < - , consists of ( ) copies of the value
K/ (F, n + l ) .
n + l n + l n + l 2
& ( x , n + l), and (for n odd) consists of copies of the value
We first prove A(Yl) > A(W2). We partition the vector A(Yl) into subvectors [":'I, where B,, I s x < - , consists of (:) copies of ~ , ( x , n) B , , O S x S -
) copies of ~ , ( n - x + I , n), B, consists of one copy of K, (o , n n + l and ( n - x + 1
n) = 1, and B(n+1),2 (for n odd) consists of ((n + 1) /2) copies of K 1 ( -
n + l 2
2 ' n) = -:. It is easily seen that lA,I = IB,I and the sum of the elements in A, equals that of the elements in B,, since for x = - (n odd), K, (F, n + 1) = -; by (3.9), and for 1 I x < -
n + l 1 n + l 2
2 ' n + l ( )K&, n + 1) = ( X -k ')K,(2, n + 1) , by (3.7)
Since the elements in A, are identical, A, < B,. By Lemma 4.1, A(Yl) > A(W2). We now consider 4 I (even) l5 n - 2. The proposition holds for n < 8 by direct
verification. In the following, assume n 1 8 . We partition A(Y,) into subvectors B,' , B; , 0 I x 5 [ 7 1 where B i consists of (:) copies of the value K l ( x , n),
n - 1
22 KARLIN, LINDQVIST, AND YAO
B,- consists of (:) copies of the value K,(n - x , n) = - K , ( x , n). [For even n,
there are also ( n;12) copies of K , ( 5, n ) = 0 in A(Y1)-] By Lemma 4.3, it suffices to establish the following facts:
(i) F o r x = 1 , 2 , ..., [ 91 - 1, A, is dominated by B z or B , , (depending on the sign of A,).
(ii) For odd n, if A(n-1) /2 and A(n+l)12 are of the same sign, then A(n-1)/2 U A(,+l)12 is dominated by B i - l ) / 2 or BG-1)/2; if they are of opposite signs, then and A(,+l ) /2 are dominated, respectively, by B;-1)/2 and B(i-l)/2 or by BG-1)12 and B&l) /2 (depending on the signs).
U An12 is dominated by B:/2-l or Bi/2- l ; if they are of opposite signs, then and An12 are dominated, respectively, by B:/2-1 and B,,- , or by B,,- , and
(iii) For even n, if An12-l and AnI2 are of the same sign, then
B :/ 2 - 1 .
Proof of (i). Since IA,I = (': ') ?(:) = IB:I = IBCl, it suffices to show
This follows from Proposition 3.2.
Proof of (4. Note that (( n '+' - 1)/2 )2&( 2 (n '+' + 1)/2 ) = ( ( n - 1 ) / 2 ) . Ineitherof the two cases in (ii), it suffices to show
n
But (4.1) follows from Proposition 3.2, (4.2) follows from (3.17), and (4.3) follows from Lemma 3.7.
n + l n + l n Proof of (iii). Note that ( 5 ) > (5 - l ) > (5 - l). In either of the cases in
(iii), it suffices to show
MARKOV CHAINS O N HYPERCUBES 23
n + l n + l n I(; - +/( 5 - 1, n + 1) + ( 5 ) K / ( 5 , n + 1) I 5 (5 - l )K*( 5 - 1, n) . (4.6)
But (4.4) follows from Proposition 3.2, (4.5) follows from (3.26), and (4.6) follows from (3.25). The proof of Proposition 4.2 is complete. rn
Proof of Theorem 1.2. (i) For even I, 2 5 I S n - 2, A(Y/) is a weighted average of A(W2), A( W4), . . . , A(W/). Since A(Wk) < A(Y,) for k = 2, 4, . . . (k I n - 2) by Proposition 4.2, we have A(Y/)<h(Y,). For odd I , 3 5 1 1 n - 1, A(Y/) is a weighted average of A(YlWl) and A("Y;), both of which are majorized by A(Yl) = A(Vl) (by Theorem 1.1). Thus A(Y/) < A(Yl). So A(Y,) < A(Y1) for 1 I n - 2. The case I = n - 1 is covered in part (ii).
(ii) The random walk $Y_l$ has eigenvalues
$$\lambda_l(x) = \sum_{i=1}^{l}\binom{n}{i}K_i(x,n)\Big/\sum_{i=1}^{l}\binom{n}{i}$$
with multiplicities $\binom{n}{x}$, $x = 0, 1, \ldots, n$. For each $l$, partition $\Lambda(Y_l)$ into 3 subvectors $\Lambda_{\alpha,l}$, $\alpha = 0, 1, 2$, where $\Lambda_{0,l}$ consists of the single value $\lambda_l(0) = 1$, $\Lambda_{1,l}$ consists of the eigenvalues (counting multiplicities) corresponding to all odd $x$, and $\Lambda_{2,l}$ consists of the eigenvalues for even $x \ge 2$. By (3.3) with $s = -1$ and $s = 1$, we have for $1 \le l \le n-1$
$$\sum_{x=0}^{n}\binom{n}{x}K_l(x,n) = \sum_{x=0}^{n}\binom{n}{x}K_l(x,n)(-1)^x = 0\,.\qquad(4.7)$$
So $\lambda_{n-1}(x) = 0$ for odd $x$, and $\lambda_{n-1}(x) = -\dfrac{1}{2^{n-1}-1}$ for even $x \ge 2$. In particular, the elements in $\Lambda_{\alpha,n-1}$ are identical for each $\alpha$.

Part (ii) will be proved if we can establish $\Lambda_{\alpha,l} > \Lambda_{\alpha,n-1}$ for $1 \le l \le n-2$, $\alpha = 0, 1, 2$. To this end, it suffices to show that the sum of the elements in $\Lambda_{\alpha,l}$ equals 0 for $\alpha = 1$ and equals $-1$ for $\alpha = 2$ (a vector always majorizes the constant vector of the same length and sum). It follows from (4.7) that for $1 \le l \le n-1$,
$$\sum_{(\mathrm{odd})\,x}\binom{n}{x}K_l(x,n) = 0\qquad\text{and}\qquad\sum_{(\mathrm{even})\,x\ge2}\binom{n}{x}K_l(x,n) = -1\,.$$
So, for $1 \le l \le n-1$,
$$\sum_{(\mathrm{odd})\,x}\binom{n}{x}\lambda_l(x) = \sum_{i=1}^{l}\binom{n}{i}\sum_{(\mathrm{odd})\,x}\binom{n}{x}K_i(x,n)\Big/\sum_{i=1}^{l}\binom{n}{i} = 0$$
and
$$\sum_{(\mathrm{even})\,x\ge2}\binom{n}{x}\lambda_l(x) = \sum_{i=1}^{l}\binom{n}{i}\sum_{(\mathrm{even})\,x\ge2}\binom{n}{x}K_i(x,n)\Big/\sum_{i=1}^{l}\binom{n}{i} = -1\,,$$
completing the proof of (ii).
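Both block-sum identities are easy to spot-check numerically. The sketch below assumes the normalized Krawtchouk convention $K_l(x,n) = k_l(x)/\binom{n}{l}$ (so that $K_l(x,n)$ is the eigenvalue of the walk on $V_l$), which is our reading of the paper's notation:

```python
from math import comb

def K(l, x, n):
    # normalized Krawtchouk polynomial: K_l(x,n) = k_l(x) / C(n,l)
    return sum((-1)**j * comb(x, j) * comb(n - x, l - j)
               for j in range(l + 1)) / comb(n, l)

def lam(l, x, n):
    # eigenvalue of the random walk on Y_l at level x
    return (sum(comb(n, i) * K(i, x, n) for i in range(1, l + 1))
            / sum(comb(n, i) for i in range(1, l + 1)))

n = 9
for l in range(1, n):
    odd_sum = sum(comb(n, x) * lam(l, x, n) for x in range(1, n + 1, 2))
    even_sum = sum(comb(n, x) * lam(l, x, n) for x in range(2, n + 1, 2))
    assert abs(odd_sum) < 1e-9 and abs(even_sum + 1) < 1e-9

# lambda_{n-1} takes only the two constant values 0 and -1/(2^{n-1} - 1)
assert all(abs(lam(n - 1, x, n)) < 1e-9 for x in range(1, n + 1, 2))
assert all(abs(lam(n - 1, x, n) + 1 / (2**(n - 1) - 1)) < 1e-9
           for x in range(2, n + 1, 2))
```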
(iii) We remind the reader that the eigenvalues of $Y_l$ are $\sum_{i=1}^{l}\binom{n}{i}K_i(x,n)\big/\sum_{i=1}^{l}\binom{n}{i}$ with multiplicities $\binom{n}{x}$. We first establish that
$$\lambda_{(2)}(Y_l) = \sum_{i=1}^{l}\binom{n}{i}K_i(1,n)\Big/\sum_{i=1}^{l}\binom{n}{i}\,.$$
The case $l = n$ is trivial. It suffices to show, for $1 \le l \le n-1$ and $x \ge 2$,
$$\sum_{i=1}^{l}\binom{n}{i}K_i(1,n) \ge \sum_{i=1}^{l}\binom{n}{i}K_i(x,n)\,.\qquad(4.8)$$
(iiia) Case 1: odd $x < n$. Note that
$$\sum_{i=0}^{l}\binom{n}{i}K_i(1,n) = \sum_{i=0}^{l}\binom{n}{i}K_1(i,n) = \text{sum of the } \binom{n}{0} + \binom{n}{1} + \cdots + \binom{n}{l} \text{ largest eigenvalues in } \Lambda(Y_1)$$
$$\ge \sum_{i=0}^{l}\binom{n}{i}K_x(i,n)\,,\qquad\text{since } \Lambda(Y_1) > \Lambda(V_x)\,,$$
$$= \sum_{i=0}^{l}\binom{n}{i}K_i(x,n)\,,$$
proving (4.8).
(iiib) Case 2: $x = n$. Since $K_i(1,n) \ge K_i(n,n) = (-1)^i$ for odd $i$, it suffices to show (4.8) for even $l$. In this case, (4.8) is equivalent to
$$\sum_{(\mathrm{odd})\,i\le l}\binom{n}{i} \ge \frac1n\sum_{i=1}^{l}i\binom{n}{i}\,.$$
But the two sides are equal, since
$$\frac1n\sum_{i=1}^{l}i\binom{n}{i} = \sum_{i=1}^{l}\binom{n-1}{i-1} = \sum_{(\mathrm{odd})\,i\le l}\binom{n}{i}\qquad\text{for even } l\,.$$
(iiic) Case 3: even $x$, odd $n$. For $l < \frac n2$,
$$\sum_{i=0}^{l}\binom{n}{i}K_i(1,n) = \sum_{i=0}^{l}\binom{n}{i}K_1(i,n) = \text{sum of the } \binom{n}{0} + \binom{n}{1} + \cdots + \binom{n}{l} \text{ largest eigenvalues in } \Lambda(Y_1)$$
$$\ge \sum_{i=0}^{l}\binom{n}{i}K_{n-x}(i^*, n)\,,\qquad\text{since } n-x \text{ is odd and } \Lambda(Y_1) > \Lambda(V_{n-x})\,,$$
where $i^* = i^*(i) = i$ if $i$ is even, and $= n-i$ if $i$ is odd. Continuing,
$$\sum_{i=0}^{l}\binom{n}{i}K_{n-x}(i^*,n) = \sum_{i=0}^{l}\binom{n}{i}(-1)^iK_{n-x}(i,n)\,,\qquad\text{by (3.8) and } n-x \text{ odd,}$$
$$= \sum_{i=0}^{l}\binom{n}{i}(-1)^iK_i(n-x,n) = \sum_{i=0}^{l}\binom{n}{i}K_i(x,n)\,.$$
For $l > \frac n2$,
$$\sum_{i=l+1}^{n}\binom{n}{i}K_i(1,n) = \sum_{i=l+1}^{n}\binom{n}{i}K_1(i,n) = \text{sum of the } \binom{n}{l+1} + \cdots + \binom{n}{n} \text{ smallest eigenvalues in } \Lambda(Y_1)$$
$$\le \sum_{i=l+1}^{n}\binom{n}{i}K_{n-x}(i^*,n)\,,\qquad\text{since } \Lambda(Y_1) > \Lambda(V_{n-x})\,,$$
$$= \sum_{i=l+1}^{n}\binom{n}{i}(-1)^iK_{n-x}(i,n) = \sum_{i=l+1}^{n}\binom{n}{i}K_i(x,n)\,.$$
So,
$$\sum_{i=0}^{l}\binom{n}{i}K_i(1,n) \ge \sum_{i=0}^{l}\binom{n}{i}K_i(x,n)\,,\qquad\text{since } \sum_{i=0}^{n}\binom{n}{i}K_i(x,n) = 0 \text{ for } 1 \le x \le n-1 \text{ [see (4.7)]}.$$
Combining (iiia), (iiib), (iiic) proves (4.8) for odd $n$.
(iiid) Case 4: even $n$, $2 \le x \le n-1$, $l < n$. It suffices to show
$$\sum_{i=1}^{l}\binom{n}{i}K_i(1,n) \ge \sum_{i=1}^{l}\binom{n}{i}K_i(x,n)\,.\qquad(4.10)$$
Since $\binom{n}{i}K_i(x,n) = \binom{n-1}{i-1}K_{i-1}(x,n-1) + \binom{n-1}{i}K_i(x,n-1)$ by (3.10), (4.9) is equivalent to the corresponding inequality in dimension $n-1$ with $l$ replaced by $l^*$, where $l^* = l$ if $l$ is even, and $= l-1$ if $l$ is odd. But this inequality is already established, since $n-1$ is odd. Inequality (4.10) can be proved similarly.
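The inequality (4.8), which the four cases above establish, can also be verified directly for small $n$; again assuming the normalized Krawtchouk convention $K_l(x,n) = k_l(x)/\binom{n}{l}$:

```python
from math import comb

def K(l, x, n):
    # normalized Krawtchouk polynomial: K_l(x,n) = k_l(x) / C(n,l)
    return sum((-1)**j * comb(x, j) * comb(n - x, l - j)
               for j in range(l + 1)) / comb(n, l)

def lhs(l, n):                # sum_{i=1}^{l} C(n,i) K_i(1,n)
    return sum(comb(n, i) * K(i, 1, n) for i in range(1, l + 1))

def rhs(l, x, n):             # sum_{i=1}^{l} C(n,i) K_i(x,n)
    return sum(comb(n, i) * K(i, x, n) for i in range(1, l + 1))

for n in range(2, 11):
    for l in range(1, n):
        assert all(lhs(l, n) >= rhs(l, x, n) - 1e-9 for x in range(2, n + 1))
```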
Finally, to see that $\lambda_{(2)}(Y_l)$ is decreasing in $l$, we observe $\lambda_{(2)}(Y_l) > K_{l+1}(1,n)$, since $K_i(1,n)$ is decreasing in $i$. It follows that $\lambda_{(2)}(Y_l) > \lambda_{(2)}(Y_{l+1})$, since $\lambda_{(2)}(Y_{l+1})$ is a weighted average of $\lambda_{(2)}(Y_l)$ and $K_{l+1}(1,n)$. The proof of Theorem 1.2 is complete. ∎
Proof of Theorem 1.3. Define
$$L(-1) = 0\,,\qquad L(l) = \binom{n}{0} + \binom{n}{1} + \cdots + \binom{n}{l}\,,\qquad l = 0, 1, \ldots, n\,.$$
Now, letting $N = 2^n$, $\Lambda(\mathcal C_N) = (\lambda_1, \ldots, \lambda_N)$ with $\lambda_i = \cos\frac{2\pi\lfloor i/2\rfloor}{N}$, $i = 1, 2, \ldots, N$ (see [9, pp. 119-121]), and $\Lambda(Y_1) = (\mu_1, \ldots, \mu_N)$ with $\mu_i = \frac{n-2l}{n}$ for $L(l-1) < i \le L(l)$, $l = 0, 1, \ldots, n$. Note that $\mu_i = -\mu_{N+1-i}$, $i = 1, 2, \ldots, N$. Hence it is enough to prove that
$$\sum_{i=1}^{j}\lambda_i \ge \sum_{i=1}^{j}\mu_i\,,\qquad j = 1, \ldots, N/2\,.$$
Moreover, since the $\mu_i$ are constant for $L(l-1) < i \le L(l)$, it suffices to prove that
$$\sum_{i=1}^{L(l)}\lambda_i \ge \sum_{i=1}^{L(l)}\mu_i\,,\qquad 0 \le l < \frac n2\,.\qquad(4.11)$$
Note that the case $l = \frac n2$ (when $n$ is even) is equivalent to the case $l = \frac n2 - 1$, since
$$\sum_{i=1}^{L(n/2)}\lambda_i = \sum_{i=1}^{L(n/2-1)}\lambda_i\qquad\text{and}\qquad\sum_{i=1}^{L(n/2)}\mu_i = \sum_{i=1}^{L(n/2-1)}\mu_i\,.$$
To prove (4.11), let $1 \le l < \frac n2$ be fixed and write $L = L(l)$. Then
$$\sum_{i=1}^{L}\mu_i = 1 + \frac1n\sum_{m=1}^{l}(n-2m)\binom{n}{m} = 1 + (L-1) - \frac2n\sum_{m=1}^{l}m\binom{n}{m} \le 1 + (L-1)\Big[1 - \frac{l+1}{n}\Big]\,,$$
where we have used the fact that $\binom{n}{m}$ is increasing in $m$ when $m < \frac n2$.

We next consider the $\lambda$'s. It is well known that $\cos\theta \ge 1 - \frac{2\theta}{\pi}$ for $0 \le \theta \le \pi/2$, so we have
$$\cos\frac{2\pi j}{N} \ge 1 - \frac{4j}{N}\,,\qquad j \le \frac N4\,.$$
Using this inequality we get
$$\sum_{i=1}^{L}\lambda_i \ge 1 + (L-1)\Big[1 - \frac{L + c(L)}{N}\Big]\,,$$
where $c(L) = 1$ if $L$ is even, and $= 2$ if $L$ is odd. Therefore, to establish (4.11) it suffices to show that
$$1 + \Big(1 - \frac{L+2}{N}\Big)(L-1) \ge 1 + (L-1)\Big(1 - \frac{l+1}{n}\Big)\,,$$
or equivalently,
$$\frac{L+2}{N} \le \frac{l+1}{n}\,.$$
Since $l < \frac n2$,
$$\frac LN \le \frac{l+1}{n+1} = \frac{l+1}{n} - \frac{l+1}{n(n+1)}\,.$$
But
$$\frac2N \le \frac{l+1}{n(n+1)}\qquad\text{for } n \ge 5\,;$$
hence
$$\frac{L+2}{N} \le \frac{l+1}{n}\,,$$
and we are done with the case $n \ge 5$. The case $n < 5$ can be checked directly.
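Under our reading of the proof (with $\lambda_i$ the sorted eigenvalues $\cos(2\pi\lfloor i/2\rfloor/N)$ of the random walk on the $N$-cycle, and $\mu_i$ the sorted eigenvalues of the walk on $Y_1$), the asserted domination of partial sums can be confirmed numerically for small $n$:

```python
import math
from itertools import accumulate

def cycle_spectrum(N):
    # eigenvalues of the simple random walk on the N-cycle
    return sorted((math.cos(2 * math.pi * j / N) for j in range(N)), reverse=True)

def ehrenfest_spectrum(n):
    # eigenvalues of the walk on Y_1: 1 - 2x/n with multiplicity C(n,x)
    vals = []
    for x in range(n + 1):
        vals += [1 - 2 * x / n] * math.comb(n, x)
    return sorted(vals, reverse=True)

def majorizes(a, b, tol=1e-9):
    # equal totals, and every prefix sum of a dominates that of b
    pa, pb = list(accumulate(a)), list(accumulate(b))
    return abs(pa[-1] - pb[-1]) < tol and all(x >= y - tol for x, y in zip(pa, pb))

for n in range(2, 9):
    assert majorizes(cycle_spectrum(2**n), ehrenfest_spectrum(n))
```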
5. FUNCTIONAL ON MARKOV CHAINS THAT IS MONOTONE WITH RESPECT TO THE SPECTRUM
The objective of this section is to show that the average expected hitting time of a time-reversible irreducible Markov chain is monotone with respect to the spectrum. Let $\{X_t,\ t = 0, 1, \ldots\}$ be an $N$-state irreducible Markov chain with transition matrix $P = \|p_{ij}\|_{N\times N}$ and stationary distribution $\pi = (\pi_1, \ldots, \pi_N)$. Let $T_j = \inf\{t \ge 1: X_t = j\}$ and $m_{ij} = E(T_j \mid X_0 = i)$; i.e., $T_j$ is the first (hitting) time to visit state $j$, and $m_{ij}$ is the expected time required to reach state $j$, starting in state $i$.

Fix a set of states $S$. For $i$ and $j$ in $S$, let ${}_Sf_{ij}^{(\nu)}$ be the probability, starting in state $i$, that the first return to a state in $S$ occurs to state $j$ at time $\nu$. Set ${}_Sf_{ij} = \sum_\nu{}_Sf_{ij}^{(\nu)}$ and $m_i^S = \sum_j\sum_\nu\nu\,{}_Sf_{ij}^{(\nu)}$, the mean first return time starting from state $i \in S$ to the set $S$. When $S$ is the set of all states, ${}_Sf_{ij} = p_{ij}$. It is obvious that ${}_SF = \|{}_Sf_{ij}\|_{i,j\in S}$ is a stochastic matrix, and it can be proved that the vector $\pi^* = \big(\pi_i/\sum_{k\in S}\pi_k\big)_{i\in S}$ is the stationary frequency vector of ${}_SF$. It can also be shown that ${}_SF$ is reversible if $P$ is reversible.
Proposition 5.1. Suppose there exists a complete set of left ($\varphi^{(k)}$) and right ($\psi^{(k)}$) biorthogonal eigenvectors (i.e., $(\varphi^{(k)}, \psi^{(l)}) = \delta_{kl}$) in Euclidean $|S|$-space (to ease the notation we suppress the symbol $S$ of $\varphi^{(k)}$ and $\psi^{(k)}$) such that
$$(\varphi^{(k)})\,{}_SF = \lambda_k^S(\varphi^{(k)})\,,\qquad {}_SF(\psi^{(k)}) = \lambda_k^S(\psi^{(k)})\,,\qquad k = 1, \ldots, |S|\,.$$
Then for $i, j \in S$
$$m_{ij} = m_{jj} + \sum_{k=2}^{|S|}\frac{(\varphi^{(k)}, m^S) - m_{jj}\lambda_k^S\varphi_j^{(k)}}{1 - \lambda_k^S}\big(\psi_i^{(k)} - \psi_j^{(k)}\big)\,,\qquad m_{jj} = \frac{(\pi^*, m^S)}{\pi_j^*}\,,\qquad(5.1a)$$
where $\lambda_1^S = 1$ and $m^S$ is the vector $(m_i^S)_{i\in S}$. For $S$ the total state space this formula reduces to
$$m_{ij} = \frac1{\pi_j}\Big[1 + \sum_{k=2}^{N}\frac{\lambda_k}{1-\lambda_k}\big(\psi_j^{(k)} - \psi_i^{(k)}\big)\varphi_j^{(k)}\Big]\,,\qquad(5.1b)$$
since then $m^S = (1, \ldots, 1)$ and $(\varphi^{(k)}, (1, \ldots, 1)) = 0$ for $k \ne 1$.

Remarks. We take $\varphi^{(1)} = \pi^*$, the stationary vector of ${}_SF$, and $\psi^{(1)} = (1, \ldots, 1)$ restricted to the coordinates of $S$.

The formula (5.1b) with $P$ symmetric is due to A. Z. Broder and A. R. Karlin [4].
Proof. The standard equations for first-passage probabilities are
$$f_{ij}^{(\nu)} = {}_Sf_{ij}^{(\nu)} + \sum_{\rho<\nu}\sum_{k\in S}{}_Sf_{ik}^{(\rho)}f_{kj}^{(\nu-\rho)} - \sum_{\rho<\nu}{}_Sf_{ij}^{(\rho)}f_{jj}^{(\nu-\rho)}\,,\qquad i, j \in S\,.$$
Passing to generating functions and differentiating produces the equation
$$m^{(j)} = m^S + {}_SFm^{(j)} - f_{\cdot j}\,m_{jj}\,,$$
where $m^{(j)} = (m_{ij})_{i\in S}$, $m^S = (m_i^S)_{i\in S}$, $f_{\cdot j} = ({}_Sf_{ij})_{i\in S}$, or equivalently
$$(I - {}_SF)m^{(j)} = m^S - m_{jj}\,f_{\cdot j}\,.\qquad(5.2)$$
Since $\pi^*\,{}_SF = \pi^*$, we have by (5.2)
$$(\pi^*, (I - {}_SF)m^{(j)}) = (\pi^*(I - {}_SF), m^{(j)}) = 0\,,$$
so that $(\pi^*, m^S) = m_{jj}(\pi^*, f_{\cdot j}) = m_{jj}\pi_j^*$, i.e., $m_{jj} = (\pi^*, m^S)/\pi_j^*$. We use the expansions
$$m^{(j)} = \sum_{l=1}^{|S|}(\varphi^{(l)}, m^{(j)})\,\psi^{(l)}$$
and
$$(I - {}_SF)m^{(j)} = \sum_{l=1}^{|S|}(\varphi^{(l)}, m^{(j)})(1 - \lambda_l^S)\,\psi^{(l)}\,.$$
On the basis of (5.2) we deduce
$$(\varphi^{(k)}, m^{(j)})(1 - \lambda_k^S) = (\varphi^{(k)}, m^S) - m_{jj}(\varphi^{(k)}, f_{\cdot j}) = (\varphi^{(k)}, m^S) - m_{jj}\lambda_k^S\varphi_j^{(k)}$$
and therefore, for $k \ne 1$,
$$(\varphi^{(k)}, m^{(j)}) = \frac{(\varphi^{(k)}, m^S) - m_{jj}\lambda_k^S\varphi_j^{(k)}}{1 - \lambda_k^S}\,.$$
This yields
$$m_{ij} - m_{jj} = \sum_{k=2}^{|S|}(\varphi^{(k)}, m^{(j)})\big(\psi_i^{(k)} - \psi_j^{(k)}\big)$$
and thereby the formula (5.1a). The proof is complete. ∎
Multiplying (5.1a) by $\pi_j^*$ and summing on $j \in S$ yields the identity
$$\sum_{j\in S}\pi_j^*\,m_{ij}\ \text{is independent of}\ i \in S\,,\qquad(5.3)$$
since $(\varphi^{(l)}, (1, \ldots, 1)) = 0$ for $l \ne 1$. Setting $j = i$ in (5.1a), multiplying by $\pi_i^*$ and summing on $i$ yields
$$\sum_{i\in S}\pi_i^*\,m_{ii} = |S|\,(\pi^*, m^S)\,,$$
since $(\pi^*, \psi^{(l)}) = 0$ for $l \ne 1$.
Definition. The average expected hitting time of an irreducible Markov chain is defined as
$$\bar m = \sum_{i,j}\pi_i\pi_j\,m_{ij}\,.$$

Corollary 5.2 [4]. Under the conditions of Proposition 5.1, the average expected hitting time is given by
$$\bar m = 1 + \sum_{k=2}^{N}\frac1{1-\lambda_k}\,.$$
A. Z. Broder and A. R. Karlin used this formula to bound cover times of random walks (see also [2]).
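The spectral expression for $\bar m$ can be checked against hitting times obtained from direct linear solves. The sketch below builds a randomly weighted reversible chain and uses the convention $m_{ii} = 1/\pi_i$ (the mean return time, per $T_j = \inf\{t \ge 1: X_t = j\}$); the constant term 1 in the formula is tied to that convention:

```python
import numpy as np

rng = np.random.default_rng(0)

# a reversible chain: random walk on a randomly weighted complete graph
W = rng.random((5, 5)); W = W + W.T; np.fill_diagonal(W, 0)
P = W / W.sum(axis=1, keepdims=True)
pi = W.sum(axis=1) / W.sum()              # stationary distribution

N = len(P)
M = np.zeros((N, N))                      # M[i, j] = m_ij
for j in range(N):
    idx = [i for i in range(N) if i != j]
    A = np.eye(N - 1) - P[np.ix_(idx, idx)]
    M[idx, j] = np.linalg.solve(A, np.ones(N - 1))
    M[j, j] = 1 / pi[j]                   # mean return time

m_bar = pi @ M @ pi                       # average expected hitting time
eig = np.sort(np.linalg.eigvals(P).real)[::-1]
spectral = 1 + np.sum(1 / (1 - eig[1:]))
print(np.isclose(m_bar, spectral))        # prints True
```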
If $P$ is time-reversible, then the conditions of Proposition 5.1 are satisfied. (Time-reversibility is defined at the end of Section 1.) Letting $\Lambda(P)$ be defined as before, we get the result:

Proposition 5.3. Suppose $P$, $Q$ are transition matrices of time-reversible irreducible Markov chains of the same order. Then $\Lambda(P) >^w \Lambda(Q)$ implies $\bar m(P) \ge \bar m(Q)$, where $>^w$ denotes weak majorization.

Proof. The real function $f(x) = \frac{x}{1-x}$ is convex and increasing for $x < 1$. Therefore $\Lambda(P) >^w \Lambda(Q)$ implies that
$$\sum_{k=2}^{N}\frac{\lambda_k(P)}{1-\lambda_k(P)} \ge \sum_{k=2}^{N}\frac{\lambda_k(Q)}{1-\lambda_k(Q)}\,,$$
which by Corollary 5.2 gives $\bar m(P) \ge \bar m(Q)$. ∎
The "average" expected hitting time $\bar m$ is defined in terms of the stationary distribution $\{\pi_i\}$. It is also of interest to consider the quantity $\max\sum m_{ij}u_iu_j$ over $\{u_i\}$ subject to $u_i \ge 0$, $\sum u_i = 1$. The following is a brief discussion of this quantity.

Note that if $P$ is time-reversible, then we may choose left eigenvectors $\varphi^{(k)}$ and right eigenvectors $\psi^{(k)}$ so that (i) $\varphi^{(k)}P = \lambda_k\varphi^{(k)}$; (ii) $P\psi^{(k)} = \lambda_k\psi^{(k)}$; (iii) $(\varphi^{(k)}, \psi^{(l)}) = \delta_{kl}$.

Lemma 5.4. If $P$ is time-reversible, then the eigenvectors in (i)-(iii) may be chosen so that $\varphi_i^{(k)} = \pi_i\psi_i^{(k)}$ for all $i$ and $k$.

Proof. Note that
$$\sum_j p_{ij}\frac{\varphi_j^{(k)}}{\pi_j} = \sum_j p_{ji}\frac{\varphi_j^{(k)}}{\pi_i}\quad(\text{reversibility}) = \lambda_k\frac{\varphi_i^{(k)}}{\pi_i}\,,$$
indicating that $\varphi^{(k)}/\pi$ is a right eigenvector of $P$ for the eigenvalue $\lambda_k$. Checking the normalizations, the result of Lemma 5.4 follows.

By Proposition 5.1 and Lemma 5.4, if $P$ is reversible, then for any $\xi$ with $\sum\xi_i = 0$
$$\sum_{i,j}m_{ij}\xi_i\xi_j = -\sum_{k=2}^{N}\frac{\lambda_k}{1-\lambda_k}\Big(\sum_i\xi_i\psi_i^{(k)}\Big)^2\,,$$
which is non-positive if $\lambda_k \ge 0$ for all $k$. Thus, the function $T(u) = \sum m_{ij}u_iu_j$ defined on the simplex $\Delta = \{u \in \mathbb R^N: u \ge 0,\ \sum u_i = 1\}$ is concave if $\lambda_k \ge 0$ for all $k$, since $T(u + t\xi)$ is concave in $t$ for any $\xi$ with $\sum\xi_i = 0$. On the other hand, if $\lambda_k \le 0$ for $k = 2, \ldots, N$, then $T(u)$ is convex on $\Delta$.
Let $\hat P = \frac12I + \frac12P$, which necessarily has only non-negative eigenvalues. (Again assume that $P$ is reversible so that all the eigenvalues are real.) So $\hat T(u) = \sum\hat m_{ij}u_iu_j$ is concave in $u \in \Delta$. Since $P$ and $\hat P$ have the same stationary distribution, $\hat m_{ii} = m_{ii} = \frac1{\pi_i}$. Also it is easily verified that $\hat m_{ij} = 2m_{ij}$ for $i \ne j$. It would be interesting to find the maximum of $\hat T(u)$, $u \in \Delta$. As a consequence of (5.3) with $S$ being the total state space, $\sum_j\pi_j\hat m_{ij}$ is independent of $i$. If $\sum_i\pi_i\hat m_{ij}$ does not depend on $j$ (which is the case when $P$ is symmetric), then the maximum occurs at $u = \pi$, since for any $\xi$ with $\sum\xi_i = 0$
$$\frac d{dt}\hat T(\pi + t\xi)\Big|_{t=0} = \sum_j\xi_j\sum_i\pi_i\hat m_{ij} + \sum_i\xi_i\sum_j\pi_j\hat m_{ij} = 0\,.\qquad(5.5)$$
So, for random walks on regular graphs, $u = \pi = \frac1N(1, \ldots, 1)$ maximizes $\hat T(u)$.

However, in general, $\sum_i\pi_i\hat m_{ij}$ does depend on $j$, and therefore $\pi$ need not be the maximum point. For example, consider the random walk on a graph of three vertices $\{1, 2, 3\}$ with the edge connections $(1,2)$ and $(2,3)$. Clearly, $m_{12} = 1 < m_{21}$, so that $\hat m_{12} = 2 < \hat m_{21}$, and in the notation of (5.5) one finds
$$\sum_i\pi_i\hat m_{i1} > \sum_i\pi_i\hat m_{i2}\,.$$
If $\pi$ is not the maximum point, then $\hat{\bar m} = \sum\hat m_{ij}\pi_i\pi_j \ne \max\hat T(u)$. It would be interesting to evaluate and compare $\max\hat T(u)$ for various Markov chains.
6. SOME OPEN PROBLEMS
(1) Let $X_t$ be the random walk on $V_1$, whose transition matrix for simplicity we also denote by $V_1$. For $l > 1$ the matrix $V_1^l$ corresponds to the chain $\{X_0, X_l, X_{2l}, \ldots\}$. As mentioned in Section 1, the chain $\{X_0, X_1, X_2, \ldots\}$ is the (labeled) Ehrenfest model, where at each time $t$ a ball is sampled at random and moved to the other urn. Thus the chain $\{X_0, X_l, X_{2l}, \ldots\}$ can be interpreted as if one samples $l$ balls at a time, with replacement. On the other hand, the random walk with transition matrix $V_l$ can be interpreted as if one samples $l$ balls at a time without replacement.
One natural question is whether $V_1^l > V_l$ for odd $l$ (for even $l$ the chains are reducible). It is easy to see that for the second largest eigenvalue
$$\lambda_{(2)}(V_1^l) = \Big(1 - \frac2n\Big)^l > \lambda_{(2)}(V_l) = |K_l(1,n)| = 1 - \frac{2l}n\,,\qquad\text{for } 1 < (\text{odd})\ l \le \frac n2\,.$$
However, numerical results show that in general $V_1^l \not> V_l$. A limited numerical study suggests, however, that the majorization result may hold for the corresponding chains on the state space $\{0, 1, \ldots, n\}$, with states given by the length of the component vector in $\mathbb Z_2^n$. Recall from Section 2 that the eigenvalues of these "lumped" chains are the same as those of the chains on $\mathbb Z_2^n$, but with all multiplicities 1.
Conjecture 1. The $(n+1)$-vector $(K_1(x,n)^l,\ x = 0, 1, \ldots, n)$ majorizes the vector $(K_l(x,n),\ x = 0, 1, \ldots, n)$, for odd $l \le \frac n2$.

The above conjecture holds when (odd) $l = n/2$. To see this, note that by (3.9), for odd $l = n/2$,
$$K_{n/2}(x, n) = \begin{cases}\dfrac{(-1)^{x/2}(x-1)(x-3)\cdots1}{(n-1)(n-3)\cdots(n-x+1)}\,, & \text{for even } x\,,\\[4pt] 0\,, & \text{for odd } x\,.\end{cases}$$
So, the set $\{K_{n/2}(x,n),\ x = 0, 1, \ldots, n\}$ consists of $\pm a_i^{(n)}$, $i = 0, 1, \ldots, \frac{n-2}4$, and $\frac n2$ copies of 0, where $a_0^{(n)} = 1$, and
$$a_i^{(n)} = |K_{n/2}(2i, n)| = \frac{(2i-1)(2i-3)\cdots1}{(n-1)(n-3)\cdots(n-2i+1)}\,,\qquad i = 1, \ldots, \frac{n-2}4\,.$$
It suffices to show that the vector $(a_1^{(n)}, \ldots, a_{(n-2)/4}^{(n)})$ is weakly majorized by the vector $(b_1^{(n)}, \ldots, b_{(n-2)/4}^{(n)})$, where $b_i^{(n)} = \big(1 - \frac{2i}n\big)^{n/2}$. In fact, we shall prove $a_i^{(n)} \le b_i^{(n)}$ for all $i$. Note that both $a_i^{(n)}$ and $b_i^{(n)}$ are decreasing in $i$. Moreover, the ratio
$$a_i^{(n)}/a_{i-1}^{(n)} = \frac{2i-1}{n-2i+1}$$
is increasing in $i$, and
$$b_i^{(n)}/b_{i-1}^{(n)} = \Big(\frac{n-2i}{n-2i+2}\Big)^{n/2}$$
is decreasing in $i$. We shall argue that $a_{(n-2)/4}^{(n)} < b_{(n-2)/4}^{(n)}$, which then implies $a_i^{(n)} \le b_i^{(n)}$ for all $i \le (n-2)/4$ (the increments of $\log(a_i^{(n)}/b_i^{(n)})$ are increasing in $i$ and start from $\log(a_0^{(n)}/b_0^{(n)}) = 0$, so if the final partial sum is negative, every partial sum is). By Robbins' [14] refinement of Stirling's formula, it can be shown that
$a_{(n-2)/4}^{(n)} < b_{(n-2)/4}^{(n)}$ for $n \ge 6$; the key estimate uses the inequality $\log(1 + x) \ge \frac23x$ for $0 \le x \le \frac23$. This completes the proof. ∎
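Conjecture 1 itself is easy to probe numerically for small $n$ (a sketch, again assuming the normalized Krawtchouk convention $K_l(x,n) = k_l(x)/\binom{n}{l}$):

```python
from math import comb
from itertools import accumulate

def K(l, x, n):
    # normalized Krawtchouk polynomial: K_l(x,n) = k_l(x) / C(n,l)
    return sum((-1)**j * comb(x, j) * comb(n - x, l - j)
               for j in range(l + 1)) / comb(n, l)

def majorizes(a, b, tol=1e-9):
    a, b = sorted(a, reverse=True), sorted(b, reverse=True)
    pa, pb = list(accumulate(a)), list(accumulate(b))
    return abs(pa[-1] - pb[-1]) < tol and all(x >= y - tol for x, y in zip(pa, pb))

for n in range(2, 11):
    for l in range(1, n // 2 + 1, 2):     # odd l <= n/2
        power = [K(1, x, n)**l for x in range(n + 1)]
        kraw = [K(l, x, n) for x in range(n + 1)]
        assert majorizes(power, kraw)
```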
(2) Instead of placing each drawn ball into the opposite urn, we now let each ball be placed independently into one of the two urns with equal probability. Then the Markov chain associated with sampling $l$ balls with replacement has transition matrix $\big(\frac12I + \frac12V_1\big)^l$, while the chain corresponding to sampling $l$ balls without replacement has transition matrix
$$P_l = 2^{-l}\binom l0I + 2^{-l}\binom l1V_1 + 2^{-l}\binom l2V_2 + \cdots + 2^{-l}\binom llV_l\,.$$
Conjecture 2. $\big(\frac12I + \frac12V_1\big)^l$ weakly majorizes $P_l$ for $l = 1, 2, \ldots, n$.

Note that the trace of $P_l$ equals $2^{-l}\,\mathrm{trace}(I) = 2^{n-l}$, and the trace of $\big(\frac12I + \frac12V_1\big)^l$ equals
$$\sum_{x=0}^{n}\binom nx\Big(1 - \frac xn\Big)^l = 2^nE\Big(1 - \frac Xn\Big)^l \ge 2^n\Big(1 - \frac{EX}n\Big)^l = 2^{n-l}$$
(apply Jensen's inequality), where $X$ is a binomial random variable with parameters $(n, \frac12)$. Conjecture 2 holds for $l = n$, since $P_n$ has eigenvalues
$$2^{-n}\sum_{i=0}^{n}\binom niK_i(x,n)\,,\qquad x = 0, 1, \ldots, n\,,$$
which are all 0 for $x = 1, 2, \ldots, n$.

While Conjecture 2 remains open, we now prove the weaker result that $\frac12I + \frac12V_1$ weakly majorizes $P_l$.
Note that
$$\sum_{(\mathrm{odd})\,i\le l}2^{-l}\binom li = \sum_{(\mathrm{even})\,i\le l}2^{-l}\binom li = \frac12\,.$$
For either ($l < n$) or ($l = n$ and $n$ even), since $V_1 > V_i$ for odd $i \le l$, we have
$$V_1 > 2^{1-l}\sum_{(\mathrm{odd})\,i\le l}\binom liV_i\,.$$
(Note that $\Lambda(V_1)$ does not majorize $\Lambda(V_n)$, which is why the case $l = n$ with $n$ odd needs separate treatment.) Also, clearly, the identity matrix $I$ weakly majorizes $2^{1-l}\sum_{(\mathrm{even})\,i\le l}\binom liV_i$. Thus $\frac12I + \frac12V_1$ weakly majorizes $P_l$. For the remaining case $l = n$ (odd), let
$$\gamma = \sum_{(\mathrm{odd})\,i<n}2^{-n}\binom ni + 2^{-n}\binom nn = \frac12\,.$$
Observe that
$$\gamma^{-1}\Big(\sum_{(\mathrm{odd})\,i<n}2^{-n}\binom niV_i + 2^{-n}\binom nnV_n\Big) = \gamma^{-1}\Big(2^{-n}\Big[\binom n1 + \binom nn\Big]W + \sum_{3\le(\mathrm{odd})\,i<n}2^{-n}\binom niV_i\Big) < V_1\,,$$
by Theorems 1.1 and 1.2, where $W$ denotes the random walk combining $V_1$ and $V_n$ (two vertices are connected if and only if their Hamming distance is 1 or $n$). Hence $P_n$ is weakly majorized by $(1-\gamma)I + \gamma V_1$, which is in turn weakly majorized by $\frac12I + \frac12V_1$, completing the proof. ∎
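The weaker result just proved can be confirmed numerically on the spectra (with multiplicities) for small $n$:

```python
from math import comb
from itertools import accumulate

def K(l, x, n):
    # normalized Krawtchouk polynomial: K_l(x,n) = k_l(x) / C(n,l)
    return sum((-1)**j * comb(x, j) * comb(n - x, l - j)
               for j in range(l + 1)) / comb(n, l)

def weakly_majorizes(a, b, tol=1e-9):
    pa = accumulate(sorted(a, reverse=True))
    pb = accumulate(sorted(b, reverse=True))
    return all(x >= y - tol for x, y in zip(pa, pb))

for n in range(2, 11):
    half = [1 - x / n for x in range(n + 1)
            for _ in range(comb(n, x))]          # spectrum of (I + V_1)/2
    for l in range(1, n + 1):
        Px = [sum(comb(l, i) * K(i, x, n) for i in range(l + 1)) / 2**l
              for x in range(n + 1)]
        P = [Px[x] for x in range(n + 1) for _ in range(comb(n, x))]
        assert weakly_majorizes(half, P)         # spectrum of P_l
```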
By lumping the $2^n$ states into $n+1$ states, we have the following variation of Conjecture 2.

Conjecture 3. The vector $\Big(\big(1 - \frac xn\big)^l,\ x = 0, 1, \ldots, n\Big)$ weakly majorizes $(a_x,\ x = 0, 1, \ldots, n)$ for $l = 1, 2, \ldots, n$, where
$$a_x = 2^{-l}\sum_{i=0}^{l}\binom liK_i(x,n)\,.$$
Again, Conjecture 3 holds for $l = n$, since $a_x = 0$ for $x = 1, 2, \ldots, n$. But we do not know whether
$$\sum_{x=0}^{n}\Big(1 - \frac xn\Big)^l \ge \sum_{x=0}^{n}a_x\,,$$
which is a necessary condition for the conjecture to hold.

(3) A natural generalization of Theorem 1.1 is the following.
Conjecture 4. $V_l > V_{l'}$ for odd $l, l' < n$ with $\Big|1 - \frac{2l}n\Big| > \Big|1 - \frac{2l'}n\Big|$.

We remark that when $n$ is even, $V_l$ and $V_{n-l}$ (odd $l$) have the same set of eigenvalues. By lumping the $2^n$ states into $n+1$ states, we have the following variation.

Conjecture 5. The vector $(K_l(x,n),\ x = 0, 1, \ldots, n)$ majorizes $(K_{l'}(x,n),\ x = 0, 1, \ldots, n)$ for odd $l, l' < n$ with $\Big|l - \frac n2\Big| > \Big|l' - \frac n2\Big|$.

Finally, numerical results on $Y_l$ for $n$ up to 12 suggest

Conjecture 6. $Y_l > Y_{l'}$ for $l < l' \le n - 2$.
REFERENCES
[1] D. Aldous, Random walks on finite groups and rapidly mixing Markov chains, in Séminaire de Probabilités XVII, Lecture Notes in Mathematics 986, Springer, New York, 1983, pp. 243-297.
[2] D. J. Aldous, Lower bounds for covering times for reversible Markov chains and random walks on graphs, J. Theor. Probab., 2, 91-100 (1989).
[3] T. Ando, Majorization, doubly stochastic matrices, and comparison of eigenvalues, Linear Algebra Appl., 118, 163-248 (1989).
[4] A. Z. Broder and A. R. Karlin, Bounds on the cover time, J. Theor. Probab., 2, 101-120 (1989).
[5] P. Diaconis, R. L. Graham, and J. A. Morrison, Asymptotic analysis of a random walk on a hypercube with many dimensions, Random Struct. Alg., 1, 51-72 (1990).
[6] S. Karlin and J. McGregor, Ehrenfest urn models, J. Appl. Probab., 2, 352-375 (1965).
[7] S. Karlin and J. McGregor, On some stochastic models in genetics, in Conference on Stochastic Models in Medicine and Biology, University of Wisconsin Press, Madison, WI, 1964, pp. 245-279.
[8] S. Karlin and J. McGregor, Linear growth models with many types and multidimensional Hahn polynomials, in Theory and Application of Special Functions (R. Askey, Ed.), Academic Press, New York, 1975, pp. 261-288.
[9] S. Karlin and H. M. Taylor, A First Course in Stochastic Processes, Academic Press, New York, 1975.
[10] S. Karlin and Y. Rinott, Entropy inequalities for classes of probability distributions II: The multivariate case, Adv. Appl. Probab., 13, 325-351 (1981).
[11] J. G. Kemeny and J. L. Snell, Finite Markov Chains, Van Nostrand, Princeton, NJ, 1960.
[12] A. W. Marshall and I. Olkin, Inequalities: Theory of Majorization and Its Applications, Academic Press, New York, 1979.
[13] P. R. Milch, A multi-dimensional linear growth birth and death process, Ann. Math. Stat., 39, 727-754 (1968).
[14] H. Robbins, A remark on Stirling's formula, Am. Math. Mon., 62, 26-29 (1955).
[15] G. Szegő, Orthogonal Polynomials, Amer. Math. Soc. Colloquium Publ. 23, 1939.

Received September 26, 1991
Revised July 20, 1992