Procesos Estocásticos 2015. Practice Sheet 5: Markov Chains
1. Prove that if µ is reversible for Q, then µ is invariant for Q.
2. Prove that if µ is invariant for Q, then the matrix Q∗ defined by

   Q∗(x, y) = µ(y) Q(y, x) / µ(x)                                        (1)

is a transition matrix (each row sums to 1).
3. We know that the probability that a cycle is traversed in the forward direction by the process Xn equals the probability that it is traversed in the reverse direction by Xn∗. Use this result to show that T^{a,a} and (T∗)^{a,a} have the same distribution.
4. Compute the invariant measures for the Markov chains in the examples from the lectures.
5. Prove that the measure µ defined when discussing the asymmetric walk on the circle is invariant for Q, and compute the reverse matrix Q∗ relative to µ. This is an example of a Markov chain having an invariant measure that is not reversible.
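The invariance and non-reversibility claims can be checked numerically. A minimal sketch, assuming the asymmetric walk on the circle Z/NZ with Q(x, x+1) = p and Q(x, x−1) = 1 − p; this convention, and the values N = 6, p = 0.7, are illustrative assumptions, not taken from the notes:

```python
import numpy as np

# Asymmetric walk on the circle Z/NZ (assumed convention):
# step +1 with probability p, step -1 with probability 1 - p.
N, p = 6, 0.7
Q = np.zeros((N, N))
for x in range(N):
    Q[x, (x + 1) % N] = p
    Q[x, (x - 1) % N] = 1 - p

mu = np.full(N, 1.0 / N)          # uniform measure on the circle

# Invariance: mu Q = mu (Q is doubly stochastic, so uniform mu works).
assert np.allclose(mu @ Q, mu)

# Reverse matrix Q*(x, y) = mu(y) Q(y, x) / mu(x).
Qstar = (mu[None, :] * Q.T) / mu[:, None]
assert np.allclose(Qstar.sum(axis=1), 1.0)   # Q* is a transition matrix
assert np.allclose(Qstar, Q.T)               # uniform mu: Q* is Q transposed
assert not np.allclose(Qstar, Q)             # not reversible, since p != 1/2
```

With uniform µ the reverse chain is simply the walk with the drift reversed, so Q∗ ≠ Q unless p = 1/2.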
6. Let Q be an irreducible matrix on a finite space G. Let

   µ(y) := (1 / E(T^{x→x})) Σ_{n≥0} P(X_n^x = y, T^{x→x} > n)            (2)

That is, µ(y) is proportional to the expected number of visits to state y before the chain returns to x. Show that µ is invariant for Q. This gives an alternative proof of the Perron-Frobenius theorem.
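Formula (2) can be sanity-checked by Monte Carlo on a small made-up chain: count visits to each state during excursions from x and compare the normalized counts with the invariant distribution obtained by linear algebra. A sketch, not a proof; the 3-state matrix below is an arbitrary example:

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary small irreducible transition matrix (made-up example).
Q = np.array([[0.1, 0.6, 0.3],
              [0.4, 0.2, 0.4],
              [0.5, 0.3, 0.2]])
cum = Q.cumsum(axis=1)
cum[:, -1] = 1.0                 # guard against rounding in the last column
x0 = 0

# Count visits during excursions that start at x0 and end at the first
# return to x0 (time 0 counted, the return itself not).
visits = np.zeros(3)
for _ in range(50_000):
    state = x0
    while True:
        visits[state] += 1
        state = int(np.searchsorted(cum[state], rng.random()))
        if state == x0:
            break

mu_hat = visits / visits.sum()   # empirical, normalized version of (2)

# Exact invariant distribution: left Perron eigenvector of Q.
w, v = np.linalg.eig(Q.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

assert np.allclose(mu_hat, pi, atol=0.02)
```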
7. Birth and death process. The birth and death process is a Markov chain on N with transition matrix:
   Q(x, x + 1) = q(x) = 1 − Q(x, x − 1), if x ≥ 1,
   Q(0, 1) = q(0) = 1 − Q(0, 0),
where (q(x))_{x∈N} is a sequence of real numbers contained in (0, 1). Under which condition on (q(x))_{x∈N} does the process admit an invariant measure? Specialize to the case q(x) = p for all x ∈ N. Hint: write down the balance equations and look for normalizable solutions.
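Following the hint for the case q(x) = p, detailed balance suggests the geometric candidate µ(x) = (p/(1 − p))^x, which is normalizable exactly when p < 1/2. A quick numerical check of the balance equations; the candidate measure and the value p = 0.3 are our assumptions, not taken from the statement:

```python
# Candidate invariant measure for q(x) = p (an assumption to be verified):
# mu(x) = r**x with r = p / (1 - p); summable exactly when p < 1/2.
p = 0.3
r = p / (1 - p)
mu = lambda x: r ** x

# Interior states x >= 1: mu(x) = mu(x-1) * p + mu(x+1) * (1 - p)
for x in range(1, 50):
    assert abs(mu(x) - (mu(x - 1) * p + mu(x + 1) * (1 - p))) < 1e-12

# Boundary state 0: mu(0) = mu(0) * (1 - p) + mu(1) * (1 - p)
assert abs(mu(0) - (mu(0) * (1 - p) + mu(1) * (1 - p))) < 1e-12
```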
8. Monte Carlo method. One of the most popular Monte Carlo methods to obtain samples from a given probability measure consists in simulating a Markov chain that has the target measure as its invariant measure. To obtain a sample of the target measure from the trajectory of the Markov chain, one needs to let the process evolve until it “attains equilibrium”. We propose a chain for the simulation.
i) Let N be a positive integer and µ a probability measure on {1, . . . , N}. Consider the following transition probabilities on {1, . . . , N}: for y ≠ x,

   Q1(x, y) = (1/(N − 1)) · µ(y)/(µ(x) + µ(y)),                          (3)
   Q1(x, x) = 1 − Σ_{y≠x} Q1(x, y).                                      (4)
(Choose uniformly a state y different from x and jump to y with probability µ(y)/(µ(x) + µ(y)); with the complementary probability stay at x.)
   Q2(x, y) = (1/(N − 1)) [ 1{µ(x) ≤ µ(y)} + (µ(y)/µ(x)) 1{µ(x) > µ(y)} ],   (5)
   Q2(x, x) = 1 − Σ_{y≠x} Q2(x, y).                                          (6)

(Choose uniformly a state y different from x; if µ(x) ≤ µ(y), jump to y; if µ(x) > µ(y), jump to y with probability µ(y)/µ(x); with the complementary probability stay at x.)
Check whether µ is reversible with respect to Q1 and/or Q2.
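One numerical way to carry out this check: generate a random target measure µ, build Q1 and Q2 from (3)-(6), and test detailed balance µ(x)Q(x, y) = µ(y)Q(y, x) entrywise. A sketch; the random µ and the size N = 8 are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random target measure on {1, ..., N} (arbitrary test case).
N = 8
mu = rng.random(N)
mu /= mu.sum()

Q1 = np.zeros((N, N))
Q2 = np.zeros((N, N))
for x in range(N):
    for y in range(N):
        if y == x:
            continue
        Q1[x, y] = mu[y] / ((N - 1) * (mu[x] + mu[y]))          # eq. (3)
        Q2[x, y] = (1.0 if mu[x] <= mu[y] else mu[y] / mu[x]) / (N - 1)  # eq. (5)
    Q1[x, x] = 1 - Q1[x].sum()                                  # eqs. (4), (6)
    Q2[x, x] = 1 - Q2[x].sum()

for Q in (Q1, Q2):
    # Detailed balance: the "flow" matrix mu(x) Q(x, y) must be symmetric.
    flow = mu[:, None] * Q
    assert np.allclose(flow, flow.T)
```

In this run both flow matrices come out symmetric, consistent with detailed balance holding for both chains.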
ii) Let G = {0, 1}^N and µ be the uniform distribution on G (that is, µ(ζ) = 2^{−N} for all ζ ∈ G).
Define a transition matrix Q on G satisfying the following conditions:
a) Q(ξ, ζ) = 0 if d(ξ, ζ) ≥ 2, where d is the Hamming distance introduced in the previous chapter;
b) µ is reversible with respect to Q.
iii) Simulate the Markov chain corresponding to the chosen matrix Q. How would you determine empirically the moment when the process attains equilibrium? Hint: plot the relative frequency of visits to each site against time and wait for it to stabilize. Give an empirical estimate of the density of ones. Compare with the true value given by µ(1), where µ is the invariant measure of the chain.
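As an illustration of item iii), one natural choice of Q (our assumption, since the exercise leaves the matrix open) flips a uniformly chosen coordinate with probability 1/2: this matrix is symmetric, hence reversible for the uniform µ, and only moves between configurations at Hamming distance at most 1. A simulation sketch tracking the time-averaged density of ones:

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed dynamics: pick a coordinate uniformly, flip it with prob. 1/2.
N, steps = 20, 100_000
xi = np.zeros(N, dtype=int)        # start from the all-zeros configuration
total_ones = 0
for _ in range(steps):
    i = rng.integers(N)
    if rng.random() < 0.5:         # lazy flip keeps the chain aperiodic
        xi[i] ^= 1
    total_ones += xi.sum()

# Time-averaged empirical density of ones; under the uniform invariant
# measure each coordinate equals 1 with probability 1/2.
density = total_ones / (steps * N)
assert abs(density - 0.5) < 0.05
```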
iv) Use the Ehrenfest model to simulate the Binomial distribution with parameters N and 1/2.
9. i) Prove that if Q is the matrix

   Q = (   p     1 − p )
       ( 1 − q     q   )                                                 (7)

then Q^n converges to the matrix

   ( (1 − q)/((1 − p) + (1 − q))   (1 − p)/((1 − p) + (1 − q)) )
   ( (1 − q)/((1 − p) + (1 − q))   (1 − p)/((1 − p) + (1 − q)) )         (8)
ii) For the same chain compute E(T^{1→1}), where
   T^{1→1} = inf{n ≥ 1 : X_n^1 = 1}.
iii) Establish a relationship between items (i) and (ii).
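Items i)-iii) can be illustrated numerically for sample values of p and q (chosen arbitrarily here): Q^n approaches the rank-one matrix (8), and E(T^{1→1}) equals 1/π(1), the mass that the limit rows put on state 1:

```python
import numpy as np

# Sample values; any 0 < p, q < 1 with p + q != 2 would do.
p, q = 0.3, 0.6
Q = np.array([[p, 1 - p],
              [1 - q, q]])

# Rows of the limit matrix (8) are the invariant distribution pi.
pi = np.array([1 - q, 1 - p]) / ((1 - p) + (1 - q))
limit = np.vstack([pi, pi])
assert np.allclose(np.linalg.matrix_power(Q, 60), limit)

# First-step analysis: E(T^{2->1}) = 1/(1-q), hence
# E(T^{1->1}) = 1 + (1-p) * E(T^{2->1}) = 1/pi(1)  (Kac's formula).
ET11 = 1 + (1 - p) / (1 - q)
assert np.isclose(ET11, 1 / pi[0])
```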
10. Let G = N and Q be a transition matrix defined as follows. For all x ∈ N,
   Q(0, x) = p(x),   and   Q(x, x − 1) = 1 if x ≥ 1,
where p is a probability measure on N. Let (X_n^0)_{n∈N} be a Markov chain with transition matrix Q and initial state 0.
i) Give sufficient conditions on p to guarantee that Q has at least one invariant measure.
ii) Compute E(T^{1→1}).
iii) Establish a relationship between items (i) and (ii).
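For item i), a natural candidate (our guess, not stated in the exercise) is the tail measure µ(x) = Σ_{y≥x} p(y), which is normalizable exactly when p has finite mean. A numerical check of the invariance equations for a made-up p with finite support:

```python
import numpy as np

rng = np.random.default_rng(4)

# Made-up jump measure p supported on {0, ..., M}.
M = 12
p = rng.random(M + 1)
p /= p.sum()

# Candidate invariant measure: tail sums mu(x) = p(x) + p(x+1) + ... + p(M).
mu = p[::-1].cumsum()[::-1]
mu_ext = np.append(mu, 0.0)        # mu(M + 1) = 0 beyond the support

# Invariance mu = mu Q reads: mu(x) = mu(0) p(x) + mu(x + 1),
# since state x is reached from 0 (prob. p(x)) and from x + 1 (prob. 1).
for x in range(M + 1):
    assert abs(mu_ext[x] - (mu_ext[0] * p[x] + mu_ext[x + 1])) < 1e-12
```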
11. Stirring process. The stirring process is a Markov chain on the hypercube G = {0, 1}^N defined by the following algorithm. Let P_N be the set of all permutations of the sequence (1, 2, · · · , N), that is, the set of bijections from {1, 2, · · · , N} into itself. Let π be an element of P_N. Let F_π : G → G be the function defined as follows. For all ξ ∈ G,
   F_π(ξ)(i) = ξ(π(i)).
In other words, F_π permutes the values of the configuration ξ, assigning to coordinate i the former value of coordinate π(i).
Let (Π_1, Π_2, · · · ) be a sequence of iid random variables on P_N. The stirring process (η_n^ζ)_{n∈N} with initial state ζ is defined as follows:
   η_n^ζ = ζ,                    if n = 0;
   η_n^ζ = F_{Π_n}(η_{n−1}^ζ),   if n ≥ 1.                               (9)
i) Show that the stirring process is not irreducible (it is reducible!).
ii) Assume that the random variables Π_n are uniformly distributed on P_N. What are the invariant measures of the stirring process in this case?
iii) Let V_N be the set of permutations that only exchange the positions of two neighboring points of (1, 2, · · · , N). A typical element of V_N is the permutation π^k, for k ∈ {1, 2, · · · , N}, defined by:

   π^k(i) = i,       if i ≠ k and i ≠ k + 1;
   π^k(i) = k + 1,   if i = k;
   π^k(i) = k,       if i = k + 1.                                       (10)
In the above representation, the sum is taken “modulo N”, that is, N + 1 = 1. Assume that the random variables Π_n are uniformly distributed on the set V_N. Compute the invariant measures of the stirring process in this case.
iv) Compare the results of items (ii) and (iii).
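A simulation sketch supporting item i): since F_π only permutes coordinates, the number of ones is conserved along any trajectory, so each level set of the Hamming weight is closed and the chain cannot be irreducible:

```python
import numpy as np

rng = np.random.default_rng(5)

# Stirring process with Pi_n uniform on P_N: eta_n = F_{Pi_n}(eta_{n-1}),
# where F_pi(eta)(i) = eta(pi(i)) is just a relabeling of coordinates.
N, steps = 10, 1000
eta = (rng.random(N) < 0.5).astype(int)   # arbitrary initial configuration
weight0 = eta.sum()
for _ in range(steps):
    pi = rng.permutation(N)
    eta = eta[pi]                          # F_pi(eta)(i) = eta(pi(i))
    assert eta.sum() == weight0            # Hamming weight never changes
```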
12. Prove that if µ is invariant for Q, then

   µ(b) = Σ_{x∈G} µ(x) Q^k(x, b)                                         (11)

for all k ≥ 1.
13. Prove that the free coupling and the independent coupling coincide after the meeting time:

   n ≥ τ^{a,b}   implies   X_n^a = X_n^b.                                (12)
14. Independent coalescing coupling. Let Q be a transition matrix on the finite or countable set G. Define the matrix Q̄ on G × G as follows:

   Q̄((a, b), (x, y)) = Q(a, x) Q(b, y),   if a ≠ b;
   Q̄((a, b), (x, y)) = Q(a, x),           if a = b and x = y;
   Q̄((a, b), (x, y)) = 0,                 if a = b and x ≠ y.            (13)
Procesos Estocásticos 2015.
4
Verify that Q̄ is a transition matrix. In other words, verify that for all (a, b) ∈ G × G,

   Σ_{(x,y)∈G×G} Q̄((a, b), (x, y)) = 1.
Observe that the chain corresponding to Q̄ describes two Markov chains with transition matrix Q that evolve independently until the first time both visit the same state. From that moment on, the two chains move together forever.
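The row-sum verification can be done numerically for a small randomly generated Q; the size n = 4 is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(6)

# Random transition matrix Q on a small state space G = {0, ..., n-1}.
n = 4
Q = rng.random((n, n))
Q /= Q.sum(axis=1, keepdims=True)

# Build Qbar on G x G following definition (13).
Qbar = np.zeros((n * n, n * n))
for a in range(n):
    for b in range(n):
        for x in range(n):
            for y in range(n):
                if a != b:
                    val = Q[a, x] * Q[b, y]   # independent moves
                elif x == y:
                    val = Q[a, x]             # coalesced: move together
                else:
                    val = 0.0                 # coalesced pairs never split
                Qbar[a * n + b, x * n + y] = val

# Every row of Qbar sums to 1, so Qbar is a transition matrix.
assert np.allclose(Qbar.sum(axis=1), 1.0)
```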
15. Independent coalescing coupling. Show that the process defined in Example ?? has transition matrix Q̄ defined by (13).
16. Show that if β(Q^k) > 0 for some k ≥ 1, then there exists a unique invariant measure µ and

   sup_{a,y} |P(X_n^a = y) − µ(y)| ≤ (1 − β(Q^k))^{n/k}.                 (14)
17. Determine whether the chains presented during the course are periodic and find their periods. For the matrices that are aperiodic and irreducible, determine the smallest power k such that all the entries of Q^k are strictly positive.
18. Determine β(Q) and α(Q) for all aperiodic and irreducible chains Q of Exercise 17. If the computations become complicated, try to find bounds for α(Q) and β(Q). When does α(Q) give a better convergence rate than β(Q)?
19. Let G = {1, 2} and Q be the following transition matrix:

   Q = ( 1/3   2/3 )
       ( 2/3   1/3 )
(a) Show that there exists n̄ such that, for all n ≥ n̄,
   0.45 ≤ Q^n(1, 2) ≤ 0.55   and   0.45 ≤ Q^n(2, 2) ≤ 0.55.
Find bounds for n̄.
(b) Obtain similar results for Q^n(1, 1) and Q^n(2, 1).
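A numerical illustration (not a proof), assuming the matrix Q = [[1/3, 2/3], [2/3, 1/3]] from the statement: its eigenvalues are 1 and −1/3, so every entry of Q^n is 1/2 ± (1/2)(−1/3)^n, and the deviation from 1/2 is (1/2)(1/3)^n ≤ 0.05 as soon as n ≥ 3, suggesting n̄ = 3 works for both (a) and (b):

```python
import numpy as np

Q = np.array([[1 / 3, 2 / 3],
              [2 / 3, 1 / 3]])

# For n >= 3 every entry of Q^n lies in [0.45, 0.55].
for n in range(3, 30):
    Qn = np.linalg.matrix_power(Q, n)
    assert np.all(np.abs(Qn - 0.5) <= 0.05)

# n = 2 is not enough: the deviation is (1/2)(1/9) ~ 0.056 > 0.05.
assert np.max(np.abs(np.linalg.matrix_power(Q, 2) - 0.5)) > 0.05
```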