108 Chapter 11: Markov chains
(b) Consider a new chain obtained by “unfolding the circle”. Now the states are arranged
as shown below. From state 1 the chain always goes to state 2, and from state 7 the
chain always goes to state 6. Find the new stationary distribution.
Solution:
(a) The symmetry of the chain suggests that the stationary distribution should be
uniform over all the states. To verify this, note that with π uniform, the reversibility
condition π_i q_{ij} = π_j q_{ji} reduces to q_{ij} = q_{ji}, which holds here by symmetry.
So the stationary distribution is (1/7, 1/7, . . . , 1/7).
(b) By the result about random walk on an undirected network, the stationary proba-
bilities are proportional to the degrees. So we just need to normalize (1, 2, 2, 2, 2, 2, 1),
obtaining (1/12, 1/6, 1/6, 1/6, 1/6, 1/6, 1/12).
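As a quick numerical sanity check (a NumPy sketch, not part of the original solution), we can build the transition matrix of the unfolded 7-state chain and confirm that the degree-proportional distribution is stationary:

```python
import numpy as np

# Transition matrix for the "unfolded" chain (states 1..7 as indices 0..6):
# interior states move to either neighbor with probability 1/2;
# the endpoint states 1 and 7 move deterministically inward.
n = 7
Q = np.zeros((n, n))
Q[0, 1] = 1.0
Q[n - 1, n - 2] = 1.0
for i in range(1, n - 1):
    Q[i, i - 1] = Q[i, i + 1] = 0.5

# Stationary distribution: degrees (1, 2, 2, 2, 2, 2, 1), normalized.
deg = np.array([1, 2, 2, 2, 2, 2, 1], dtype=float)
pi = deg / deg.sum()  # (1/12, 1/6, 1/6, 1/6, 1/6, 1/6, 1/12)

# Check the stationarity equation pi Q = pi.
print(np.allclose(pi @ Q, pi))  # True
```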
10. (s) Let X_n be the price of a certain stock at the start of the nth day, and assume that
X_0, X_1, X_2, . . . follows a Markov chain with transition matrix Q. (Assume for simplicity
that the stock price can never go below 0 or above a certain upper bound, and that it
is always rounded to the nearest dollar.)
(a) A lazy investor only looks at the stock once a year, observing the values on days
0, 365, 2 · 365, 3 · 365, . . . . So the investor observes Y_0, Y_1, . . . , where Y_n is the price after
n years (which is 365n days; you can ignore leap years). Is Y_0, Y_1, . . . also a Markov
chain? Explain why or why not; if so, what is its transition matrix?
(b) The stock price is always an integer between $0 and $28. From each day to the next,
the stock goes up or down by $1 or $2, all with equal probabilities (except for days when
the stock is at or near a boundary, i.e., at $0, $1, $27, or $28).
If the stock is at $0, it goes up to $1 or $2 on the next day (after receiving government
bailout money). If the stock is at $28, it goes down to $27 or $26 the next day. If the
stock is at $1, it either goes up to $2 or $3, or down to $0 (with equal probabilities);
similarly, if the stock is at $27 it either goes up to $28, or down to $26 or $25. Find the
stationary distribution of the chain.
Solution:
(a) Yes, it is a Markov chain: given the whole past history Y_0, Y_1, . . . , Y_n, only the
most recent information Y_n matters for predicting Y_{n+1}, because X_0, X_1, . . . is Markov.
The transition matrix of Y_0, Y_1, . . . is Q^365, since the kth power of Q gives the k-step
transition probabilities.
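This can be sanity-checked numerically; the sketch below uses a hypothetical 3-state transition matrix standing in for the stock chain's Q, since the same argument applies to any Q:

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1), standing in
# for the stock chain's Q; the reasoning is the same for any Q.
Q = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# The yearly chain's transition matrix is the 365th power of Q.
Qk = np.linalg.matrix_power(Q, 365)

# Rows of Q^365 are still valid probability distributions.
print(np.allclose(Qk.sum(axis=1), 1.0))  # True

# Two-step check by hand: P(X_2 = j | X_0 = i) = sum_m Q[i,m] Q[m,j],
# which is exactly the (i, j) entry of Q @ Q.
Q2_manual = np.array([[sum(Q[i, m] * Q[m, j] for m in range(3))
                       for j in range(3)] for i in range(3)])
print(np.allclose(Q2_manual, Q @ Q))  # True
```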
(b) This is an example of random walk on an undirected network, so we know the
stationary probability of each node is proportional to its degree. The degrees are
(2, 3, 4, 4, . . . , 4, 4, 3, 2), where there are 29 − 4 = 25 4’s. The sum of these degrees is
2 + 3 + 25 · 4 + 3 + 2 = 110 (coincidentally?). Thus, the stationary distribution is
(2/110, 3/110, 4/110, 4/110, . . . , 4/110, 4/110, 3/110, 2/110),
with 25 4/110’s.
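The degree argument can be verified numerically; here is a sketch (assuming NumPy) that builds the 29-state transition matrix directly from the rules above and checks that the degree-proportional distribution is stationary:

```python
import numpy as np

# Stock chain on states $0..$28: from state i, move to each of
# i-2, i-1, i+1, i+2 that stays in [0, 28], with equal probabilities.
n = 29
Q = np.zeros((n, n))
for i in range(n):
    nbrs = [j for j in (i - 2, i - 1, i + 1, i + 2) if 0 <= j < n]
    for j in nbrs:
        Q[i, j] = 1.0 / len(nbrs)

# Degrees (number of neighbors) should be (2, 3, 4, ..., 4, 3, 2),
# summing to 110.
deg = (Q > 0).sum(axis=1)
print(deg.sum())  # 110

# Stationary distribution proportional to degrees.
pi = deg / deg.sum()
print(np.allclose(pi @ Q, pi))  # True
```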
11. (s) In chess, the king can move one square at a time in any direction (horizontally,
vertically, or diagonally).
For example, in the diagram, from the current position the king can move to any of 8
possible squares. A king is wandering around on an otherwise empty 8 × 8 chessboard,
where for each move all possibilities are equally likely. Find the stationary distribution