Question

In: Statistics and Probability

Let Xn be a simple random walk (p = 1/2) on {0, 1, ...

Let Xn be a simple random walk (p = 1/2) on {0, 1, ..., 100} with absorbing boundaries. Suppose X0 = 50. Let T = min{j : Xj = 0 or Xj = 100}. Let Fn denote the information contained in X1, ..., Xn.

(1) Verify that Xn is a martingale.

(2) Find P(XT = 100).

(3) Let Mn = Xn^2 − n. Verify that Mn is also a martingale.

(4) It is known that Mn and T satisfy the assumptions of the Optional Sampling Theorem, and hence E(MT) = E(M0). Use this fact to find E(T).
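A sketch of the standard computations (assuming, as usual, that the walk is stopped at the absorbing boundary, so the one-step transitions below apply when 0 < Xn < 100):

(1) E(X_{n+1} | Fn) = (1/2)(Xn + 1) + (1/2)(Xn − 1) = Xn, and at 0 and 100 the walk stays put, so (Xn) is a martingale.

(2) The martingale is bounded, so optional stopping applies: 50 = E(X0) = E(XT) = 100 · P(XT = 100) + 0 · P(XT = 0), hence P(XT = 100) = 1/2.

(3) E(M_{n+1} | Fn) = (1/2)(Xn + 1)^2 + (1/2)(Xn − 1)^2 − (n + 1) = Xn^2 + 1 − (n + 1) = Mn.

(4) E(MT) = E(M0) = 50^2 = 2500 gives E(XT^2) − E(T) = 2500, while E(XT^2) = 100^2 · (1/2) = 5000 by part (2), so E(T) = 5000 − 2500 = 2500.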

Solutions

Expert Solution

Definition 7.19. We say a sub-probability π : S → [0, 1] is invariant if πP = π, i.e.

Σ_{i∈S} π(i) P_{ij} = π(j) for all j ∈ S. (7.16)

An invariant probability π : S → [0, 1] is called an invariant distribution.

Example 7.20. If #(S) < ∞ and p : S × S → [0, 1] is a Markov transition matrix whose column sums all equal 1, then π(i) := 1/#(S) is an invariant distribution for p. In particular, if p is a symmetric Markov transition matrix (p(i, j) = p(j, i) for all i, j ∈ S), then the uniform distribution π is an invariant distribution for p.

Example 7.21. Let S be a finite set of nodes and G an undirected graph on S, i.e. G is a subset of S × S such that
1. (x, x) ∉ G for all x ∈ S,
2. if (x, y) ∈ G, then (y, x) ∈ G [the graph is undirected], and
3. for all x ∈ S, the set S_x := {y ∈ S : (x, y) ∈ G} is not empty. [We are not allowing for any isolated nodes in our graph.]

Let ν(x) := #(S_x) = Σ_{y∈S} 1_{(x,y)∈G} be the valence of G at x. The random walk on this graph is then the Markov chain on S with Markov transition matrix

p(x, y) := (1/ν(x)) 1_{S_x}(y) = (1/ν(x)) 1_{(x,y)∈G}.

Notice that

Σ_{x∈S} ν(x) p(x, y) = Σ_{x∈S} 1_{(x,y)∈G} = Σ_{x∈S} 1_{(y,x)∈G} = ν(y).

Thus if we let Z := Σ_{x∈S} ν(x) and π(x) := ν(x)/Z, then π is an invariant distribution for p.
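As a quick numerical illustration of Example 7.21 (a sketch only; the small graph below is an assumed toy example, not from the text):

import numpy as np

# Toy undirected graph on nodes {0, 1, 2, 3} with edges 0-1, 1-2, 2-3, 3-0, 0-2;
# no isolated nodes, so Example 7.21 applies.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
A = np.zeros((4, 4))
for x, y in edges:
    A[x, y] = A[y, x] = 1.0         # undirected: (x, y) in G iff (y, x) in G

nu = A.sum(axis=1)                  # valence nu(x) = #(S_x)
p = A / nu[:, None]                 # p(x, y) = 1_{(x,y) in G} / nu(x)
pi = nu / nu.sum()                  # candidate invariant distribution nu(x)/Z

print(np.allclose(pi @ p, pi))      # prints True: pi p = pi, so pi is invariant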


Related Solutions

4. (Reflected random walk) Let {Xn|n ≥ 0} be as in Q6. Show that Xn+1 =...
4. (Reflected random walk) Let {Xn | n ≥ 0} be as in Q6. Show that X_{n+1} = X0 + Z_{n+1} − Σ_{m=0}^{n} min{0, X_m + V_{m+1} − U_{m+1}}, where Z_n = Σ_{m=1}^{n} (V_m − U_m), n ≥ 1. Q5. (Extreme value process) Let {In | n ≥ 0} be an i.i.d. sequence of Z-valued random variables such that P{I1 = k} = pk, k ∈ Z, and pk > 0 for some k > 0. Define Xn = max{I1, I2, ...
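A brief sketch, assuming (as the identity suggests) that Q6 defines the reflected walk by the Lindley recursion X_{n+1} = max(X_n + V_{n+1} − U_{n+1}, 0): since max(a, 0) = a − min(0, a),

X_{n+1} = X_n + (V_{n+1} − U_{n+1}) − min{0, X_n + V_{n+1} − U_{n+1}},

and summing this one-step identity over m = 0, ..., n telescopes to the stated formula.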
Q1. Let {Xn : n ≥ 0} denote the random walk on the 9-cycle. Express it as...
Q1. Let {Xn : n ≥ 0} denote the random walk on the 9-cycle. Express it as a random walk on a group (G, ·) with transition probabilities given by pxy = µ(y · x^{−1}) for an appropriate distribution µ on G. Q2. Consider the stochastic process {Xn | n ≥ 0} given by X0 = 1, Xn+1 = I{Xn = 1}Un+1 + I{Xn ≠ 1}Vn+1, n ≥ 0, where {(Un, Vn) | n ≥ 1} is an i.i.d. sequence of random variables such...
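For Q1, a hedged reading (assuming the walk meant is the simple symmetric one on the 9-cycle): take G = (Z9, +) with step distribution µ(+1) = µ(−1) = 1/2, so that pxy = µ(y · x^{−1}) = µ(y − x).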
Let b > 0 be an integer. Find the probability that a symmetric simple random walk...
Let b > 0 be an integer. Find the probability that a symmetric simple random walk started from 0 first visits b at the nth step. Hint: draw a picture, and try to describe the requirements that the path consisting of the first n − 1 steps should satisfy. The reflection principle (or a related result) should be helpful after that.
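The route the hint points at ends in the hitting-time identity (a sketch, under the stated setup): for the symmetric walk Sn started at 0 and b > 0,

P(Tb = n) = (b/n) P(Sn = b) = (b/n) C(n, (n + b)/2) 2^{−n}

when n ≥ b and n + b is even, and 0 otherwise.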
Let X1,...,Xn be i.i.d. random variables with mean 0 and variance σ² > 0. In class...
Let X1,...,Xn be i.i.d. random variables with mean 0 and variance σ² > 0. In class we have shown a central limit theorem, √n X̄n/σ → N(0, 1) as n → ∞, (1) with the assumption E(X1) = 0. Using (1), we now prove the theorem for a more general E(X1) = µ ≠ 0 case. Now suppose X1,...,Xn are i.i.d. random variables with mean µ ≠ 0 and variance σ². (a) Show that for dummy random variables Yi = Xi − µ, E(Yi) = 0 and V...
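The centering step, sketched: with Yi = Xi − µ, E(Yi) = 0 and Var(Yi) = σ², and Ȳn = X̄n − µ, so applying (1) to Y1,...,Yn gives √n (X̄n − µ)/σ → N(0, 1) as n → ∞.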
Let {xn} be a real summable sequence with xn ≥ 0 eventually. Prove that √(xn · xn+1) is...
Let {xn} be a real summable sequence with xn ≥ 0 eventually. Prove that √(xn · xn+1) is summable.
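The key inequality (one standard approach): for xn, xn+1 ≥ 0, AM-GM gives

√(xn · xn+1) ≤ (xn + xn+1)/2,

so the tail of Σ √(xn · xn+1) is dominated by (1/2) Σ (xn + xn+1) < ∞, and comparison gives summability.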
Let {Xn | n ≥ 0} be a Markov chain with state space S = {0, 1, 2,...
Let {Xn | n ≥ 0} be a Markov chain with state space S = {0, 1, 2, 3} and transition probability matrix (pij) given by

    [ 2/3  1/3   0    0  ]
    [ 1/3  2/3   0    0  ]
    [  0   1/4  1/4  1/2 ]
    [  0    0   1/2  1/2 ]

Determine all recurrent states. Q3. Let {Xn | n ≥ 0} be a Markov chain with state space S = {0, 1, 2} and transition probability matrix (pij...
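With the matrix read as above, a quick classification sketch: {0, 1} is a closed, finite communicating class, so 0 and 1 are recurrent; from state 2 the chain enters {0, 1} with positive probability and can never return, so 2 is transient, and so is 3 (which leads to 2).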
11.4 Let p be a prime. Let S = ℤ/p - {0} = {[1]p, [2]p, ....
11.4 Let p be a prime. Let S = ℤ/p − {0} = {[1]p, [2]p, ..., [p−1]p}. Prove that for y ≠ 0, Ly restricts to a bijective map Ly|_S : S → S. 11.5 Prove Fermat's Little Theorem.
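A sketch of the step from 11.4 to 11.5 (assuming, as 11.4 suggests, that Ly is left multiplication by [y]p): since Ly|_S is a bijection of S,

∏_{x∈S} (y · x) = ∏_{x∈S} x in ℤ/p,

and the left-hand side equals y^{p−1} ∏_{x∈S} x; cancelling the invertible product gives y^{p−1} ≡ 1 (mod p) for y ≠ 0.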
Let x = (x1,...,xn) ∼ N(0,In) be a MVN random vector in Rn. (a) Let U...
Let x = (x1,...,xn) ∼ N(0, In) be a MVN random vector in Rn. (a) Let U ∈ Rn×n be an orthogonal matrix (U^T U = U U^T = In) and find the distribution of U^T x. Let y = (y1,...,yn) ∼ N(0, Σ) be a MVN random vector in Rn. Let Σ = UΛU^T be the spectral decomposition of Σ. (b) Someone claims that the diagonal elements of Λ are nonnegative. Is that true? (c) Let z = U^T y and find the distribution of...
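A sketch of the standard computations: U^T x ∼ N(0, U^T In U) = N(0, In); the diagonal entries of Λ are the eigenvalues of the positive semidefinite Σ, so the claim in (b) is true; and z = U^T y ∼ N(0, U^T Σ U) = N(0, Λ), i.e. independent coordinates with variances Λii.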
1. Let X be a random variable with density p(x) = x/2 for 0 < x <...
1. Let X be a random variable with density p(x) = x/2 for 0 < x < 2 and 0 otherwise. Let Y = X^2 − 2. a) Compute the CDF and pdf of Y. b) Compute P(Y > 0 | X ≤ 1.8).
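A worked sketch: X takes values in (0, 2), so Y = X^2 − 2 takes values in (−2, 2), and for −2 < y < 2,

F_Y(y) = P(X^2 ≤ y + 2) = ∫_0^{√(y+2)} (x/2) dx = (y + 2)/4,

so f_Y(y) = 1/4 on (−2, 2), i.e. Y is uniform. For b), P(Y > 0 | X ≤ 1.8) = P(√2 < X ≤ 1.8) / P(X ≤ 1.8) = (1.8^2 − 2)/1.8^2 = 1.24/3.24 ≈ 0.383.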
Let X1,…, Xn be a sample of iid N(0, θ) random variables with Θ = (0, ∞). Determine...
Let X1,…, Xn be a sample of iid N(0, θ) random variables with Θ = (0, ∞). Determine a) the MLE θ̂ of θ, b) E(θ̂), c) the asymptotic variance of the MLE of θ, and d) the MLE of SD(Xi) = √θ.
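A sketch under the stated model (θ the variance): maximizing the likelihood gives θ̂ = (1/n) Σ Xi^2, so E(θ̂) = θ; the Fisher information is I(θ) = 1/(2θ²), so the asymptotic variance of θ̂ is 2θ²/n; and by invariance of the MLE, the MLE of SD(Xi) = √θ is √θ̂.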