Question

In: Statistics and Probability

Let x = (x1,...,xn) ∼ N(0,In) be a MVN random vector in Rn. (a) Let U...

Let x = (x1, ..., xn) ∼ N(0, In) be a MVN random vector in R^n. (a) Let U ∈ R^(n×n) be an orthogonal matrix (U^T U = U U^T = In) and find the distribution of U^T x. Let y = (y1, ..., yn) ∼ N(0, Σ) be a MVN random vector in R^n. Let Σ = U Λ U^T be the spectral decomposition of Σ.
(b) Someone claims that the diagonal elements of Λ are nonnegative. Is that true?
(c) Let z = U^T y and find the distribution of z. (d) What is cov(zi, zj) for i ≠ j? Here, zi is the ith component of z = (z1, ..., zn). What is var(zi)?
(e) Are the components of z independent? (f) Let a = (a1, ..., an) ∈ R^n be a fixed (nonrandom) vector, and find the distribution of a^T z.
(g) Assume that Λii > 0 for all i. (Here, Λii is the ith diagonal entry of Λ.) Can you choose a from part (f) to make var(a^T z) = 1? If so, specify one such a. (h) Let u1, u2 ∈ R^n be the first and second columns of U. Find the joint distribution of (2 u2^T y − u1^T y, u1^T y) ∈ R^2. Note that this is a two-dimensional vector.

Solutions

Expert Solution

(a) Since x ∼ N(0, In) and U^T is a fixed linear map, U^T x is again multivariate normal, with mean U^T · 0 = 0 and covariance U^T In U = U^T U = In. Hence U^T x ∼ N(0, In): an orthogonal transformation leaves a standard normal vector's distribution unchanged.

(b) Yes. Σ is a covariance matrix, hence positive semidefinite, and the diagonal entries of Λ are the eigenvalues of Σ. Concretely, for the ith column ui of U, Λii = ui^T Σ ui = var(ui^T y) ≥ 0.
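The distributional claims above can be checked by simulation. Below is a minimal sketch (not part of the original solution): U is taken from a QR decomposition of a random matrix, and lam is an arbitrary positive eigenvalue vector standing in for the diagonal of Λ.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 200_000                              # dimension and number of Monte Carlo draws

# Build an arbitrary orthogonal U via QR, and a covariance Sigma = U diag(lam) U^T.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
lam = np.array([4.0, 2.0, 1.0, 0.5])           # illustrative eigenvalues, all positive
Sigma = U @ np.diag(lam) @ U.T

# (a) x ~ N(0, I_n): U^T x should again have identity covariance.
x = rng.standard_normal((m, n))
print(np.round(np.cov((x @ U).T), 2))          # approximately I_n

# (c), (d) y ~ N(0, Sigma) and z = U^T y: cov(z) should be the diagonal matrix Lambda.
y = rng.multivariate_normal(np.zeros(n), Sigma, size=m)
z = y @ U                                      # each row is (U^T y)^T
print(np.round(np.cov(z.T), 2))                # approximately diag(lam)

# (g) a = (1/sqrt(Lambda_11), 0, ..., 0) gives var(a^T z) = 1.
a = np.zeros(n)
a[0] = 1.0 / np.sqrt(lam[0])
print(np.round(np.var(z @ a), 2))              # approximately 1.0
```

The printed covariance of z is approximately diag(lam), which also illustrates (d) and (e): the zi are uncorrelated with var(zi) = Λii, and since z is jointly normal they are independent.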


Related Solutions

Let X1, …, Xn be a sample of iid N(0, θ) random variables with Θ = (0, ∞). Determine...
Let X1, …, Xn be a sample of iid N(0, θ) random variables with Θ = (0, ∞). Determine a) the MLE θ̂ of θ. b) E(θ̂). c) the asymptotic variance of the MLE of θ. d) the MLE of SD(Xi) = √θ.
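Assuming θ here denotes the variance (so Xi ∼ N(0, θ) with θ ∈ (0, ∞)), the log-likelihood is maximized at the average of the squared observations; a minimal numerical sketch of that claim:

```python
import numpy as np

# Sketch, assuming theta is the variance of N(0, theta): the log-likelihood
#   -(n/2) log(2*pi*theta) - sum(x_i^2) / (2*theta)
# is maximized at theta_hat = mean(x_i^2).
rng = np.random.default_rng(1)
theta_true = 2.5
x = rng.normal(0.0, np.sqrt(theta_true), size=100_000)

theta_hat = np.mean(x**2)          # MLE of theta
sd_hat = np.sqrt(theta_hat)        # MLE of SD(X_i) = sqrt(theta), by invariance
print(theta_hat, sd_hat)           # close to 2.5 and sqrt(2.5)
```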
Let X1, ..., Xn be a random sample from U(0, 3). Recall that this means fXi...
Let X1, ..., Xn be a random sample from U(0, 3). Recall that this means fXi(xi) = 1/3, 0 < xi < 3, i = 1, ..., n, and that all Xi are mutually independent. Let X(1) ≤ X(2) ≤ ... ≤ X(n) be the order statistics of the random sample. Denote Y1 = X(1). • Derive FXi(xi). • Find FY1(y) = P(Y1 ≤ y). Hint: Use the complement rule of probability. That is, P(Y1...
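A quick sanity check of the formula the complement rule yields in this setup, FY1(y) = 1 − (1 − y/3)^n for 0 < y < 3 (the values of n and y below are chosen arbitrarily for illustration):

```python
import numpy as np

# Compare the empirical CDF of Y1 = min(X_1, ..., X_n), X_i ~ U(0, 3),
# with F_{Y1}(y) = 1 - (1 - y/3)^n obtained from the complement rule.
rng = np.random.default_rng(2)
n, reps, y = 5, 200_000, 0.4

samples = rng.uniform(0.0, 3.0, size=(reps, n))
y1 = samples.min(axis=1)

print(np.mean(y1 <= y))            # empirical P(Y1 <= y)
print(1 - (1 - y / 3.0) ** n)      # closed-form value
```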
Let X1, ..., Xn be independent random variables, and let X = X1 + ... + Xn be their sum. 1. Suppose that each Xi...
Let X1, ..., Xn be independent random variables, and let X = X1 + ... + Xn be their sum. 1. Suppose that each Xi is geometric with respective parameter pi. It is known that the mean of X is equal to μ, where μ > 0. Show that the variance of X is minimized if the pi's are all equal to n/μ. 2. Suppose that each Xi is Bernoulli with respective parameter pi. It is known that the mean of X is equal to μ, where μ >...
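For part 1, under the number-of-trials convention (each geometric Xi has mean 1/pi and variance (1 − pi)/pi²), a small numerical illustration comparing the equal choice pi = n/µ with another feasible choice (values picked only for illustration):

```python
import numpy as np

# Among p's satisfying the mean constraint sum(1/p_i) = mu, the equal choice
# p_i = n/mu should give the smallest variance of X = X_1 + ... + X_n.
def var_sum(p):
    p = np.asarray(p, dtype=float)
    return np.sum((1 - p) / p**2)

n, mu = 3, 6.0
equal = np.full(n, n / mu)                  # p_i = 0.5; 1/p_i sums to mu = 6
other = 1.0 / np.array([1.0, 2.0, 3.0])     # 1/p_i also sums to 6

print(var_sum(equal))   # 6.0
print(var_sum(other))   # 8.0, larger, consistent with the claim
```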
Let X1, X2, . . . , Xn be a random sample of size n from...
Let X1, X2, . . . , Xn be a random sample of size n from a Poisson distribution with unknown mean µ. It is desired to test the following hypotheses H0: µ = µ0 versus H1: µ ≠ µ0, where µ0 > 0 is a given constant. Derive the likelihood ratio test statistic.
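Since the unrestricted MLE of µ is the sample mean, the likelihood ratio statistic reduces to −2 log Λ = 2n[µ0 − x̄ + x̄ log(x̄/µ0)]; a minimal sketch (the data below are simulated only for illustration):

```python
import numpy as np

# -2 log(likelihood ratio) for H0: mu = mu0 vs H1: mu != mu0 with Poisson data.
# The unrestricted MLE is xbar, giving 2*n*(mu0 - xbar + xbar*log(xbar/mu0)),
# approximately chi-squared with 1 df under H0 for large n.
def neg2_log_lrt(x, mu0):
    x = np.asarray(x, dtype=float)
    n, xbar = x.size, x.mean()
    if xbar == 0.0:                       # convention 0 * log(0) = 0
        return 2.0 * n * mu0
    return 2.0 * n * (mu0 - xbar + xbar * np.log(xbar / mu0))

rng = np.random.default_rng(3)
x = rng.poisson(2.0, size=50)
print(neg2_log_lrt(x, mu0=2.0))           # typically small when H0 holds
print(neg2_log_lrt(x, mu0=3.0))           # noticeably larger when mu0 is off
```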
Let X1, ..., Xn be a random sample of size n from an exponential distribution with...
Let X1, ..., Xn be a random sample of size n from an exponential distribution with parameter λ. Let Di = (n − i + 1)[X(i) − X(i−1)], i = 1, ..., n, where X(0) = 0, be the normalized spacings. Using Jacobians, derive the joint distribution of D1, ..., Dn.
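The Jacobian calculation shows that D1, ..., Dn are themselves i.i.d. exponential with the same rate; a Monte Carlo sanity check of that conclusion (λ and n below chosen arbitrarily):

```python
import numpy as np

# For X_1, ..., X_n i.i.d. Exp(lambda), the normalized spacings
# D_i = (n - i + 1) * (X_(i) - X_(i-1)), with X_(0) = 0, are i.i.d. Exp(lambda).
rng = np.random.default_rng(4)
lam, n, reps = 1.5, 6, 100_000

x = rng.exponential(1.0 / lam, size=(reps, n))
xs = np.sort(x, axis=1)                                        # order statistics
gaps = np.diff(np.concatenate([np.zeros((reps, 1)), xs], axis=1), axis=1)
D = gaps * (n - np.arange(n))                                  # weights n, n-1, ..., 1

print(np.round(D.mean(axis=0), 2))            # each close to 1/lambda = 0.67
print(np.round(np.cov(D.T)[0, 1], 3))         # close to 0 (uncorrelated across i)
```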
Let X1, X2, X3, . . . be independent random variables such that Xn ∼ Bin(n,...
Let X1, X2, X3, . . . be independent random variables such that Xn ∼ Bin(n, 0.5) for n ≥ 1. Let N ∼ Geo(0.5) and assume it is independent of X1, X2, . . .. Further define T = XN. (a) Find E(T) and argue that T is short proper. (b) Find the pgf of T. (c) Use the pgf of T in (b) to find P(T = n) for n ≥ 0. (d) Use the pgf of...
Let x1 > 1 and xn+1 := 2 − 1/xn for n ∈ N. Show that xn is...
Let x1 > 1 and xn+1 := 2 − 1/xn for n ∈ N. Show that xn is bounded and monotone. Find the limit. Prove by induction.
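A short numerical iteration of the recursion; the identity xn − xn+1 = (xn − 1)²/xn > 0 for xn > 1, together with the fact that xn+1 > 1 whenever xn > 1, drives the monotonicity and boundedness argument, and the fixed point of L = 2 − 1/L is L = 1:

```python
# Iterate x_{n+1} = 2 - 1/x_n from a starting value x_1 > 1.
x = 5.0
for _ in range(30):
    x = 2.0 - 1.0 / x
print(x)   # approaches the limit 1, slowly
```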
Let X1, ..., Xn be i.i.d. random variables with mean 0 and variance σ² > 0. In class...
Let X1, ..., Xn be i.i.d. random variables with mean 0 and variance σ² > 0. In class we have shown a central limit theorem, √n X̄n/σ ⇒ N(0, 1) as n → ∞, (1) with the assumption E(X1) = 0. Using (1), we now prove the theorem for the more general case E(X1) = µ ≠ 0. Now suppose X1, ..., Xn are i.i.d. random variables with mean µ ≠ 0 and variance σ². (a) Show that for the dummy random variables Yi = Xi − µ, E(Yi) = 0 and V...
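A simulation sketch of the reduction Yi = Xi − µ: the standardized mean √n (X̄n − µ)/σ equals √n Ȳn/σ, so (1) applies even when µ ≠ 0 (the skewed distribution below, with mean µ and standard deviation σ, is chosen only for illustration):

```python
import numpy as np

# X_i = mu + sigma * (E_i - 1) with E_i ~ Exp(1) has mean mu and sd sigma.
# The standardized sample mean should then be approximately N(0, 1).
rng = np.random.default_rng(5)
mu, sigma, n, reps = 3.0, 2.0, 200, 20_000

x = mu + sigma * (rng.exponential(1.0, size=(reps, n)) - 1.0)
z = np.sqrt(n) * (x.mean(axis=1) - mu) / sigma

print(np.round(z.mean(), 2), np.round(z.std(), 2))   # close to 0 and 1
```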
4. (Reflected random walk) Let {Xn|n ≥ 0} be as in Q6. Show that Xn+1 =...
4. (Reflected random walk) Let {Xn | n ≥ 0} be as in Q6. Show that Xn+1 = X0 + Zn+1 − Σ_{m=0}^{n} min{0, Xm + Vm+1 − Um+1}, where Zn = Σ_{m=1}^{n} (Vm − Um), n ≥ 1. Q5. (Extreme value process) Let {In | n ≥ 0} be an i.i.d. sequence of Z-valued random variables such that P{I1 = k} = pk, k ∈ Z, and pk > 0 for some k > 0. Define Xn = max{I1, I2, ...
Suppose that X1, . . . , Xn is a random sample from a U(0, θ] distribution....
Suppose that X1, . . . , Xn is a random sample from a U(0, θ] distribution. (a) Find the most powerful test of size α to test H0: θ = θ0 vs Ha: θ = θa, where θa < θ0. (b) Is the test obtained in part (a) the UMP(α) test of H0: θ = θ0 vs Ha: θ < θ0? (c) Find the most powerful test of size α to test H0: θ = θ0 vs Ha: θ =...