Let (X1, X2) have a bivariate normal distribution with mean vector (µ1, µ2), variances σ1² for X1 and σ2² for X2, and correlation cor(X1, X2) = ρ.
(a) Write down the joint density f(x1, x2).
(b) Find the marginal distribution f(x1)
(c) Find the conditional distribution f(x1 | x2) and the mean
and variance of the conditional distribution.
(d) Obtain the likelihood equations and calculate the MLEs of µ1, µ2, σ1², σ2², and ρ.
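The conditional-distribution formulas in part (c) can be spot-checked numerically: dividing the joint density by the marginal of X2 should reproduce a normal density with mean µ1 + ρ(σ1/σ2)(x2 − µ2) and variance σ1²(1 − ρ²). A minimal sketch with arbitrary illustrative parameter values:

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

# Arbitrary parameter values, chosen only for illustration
mu1, mu2 = 1.0, -2.0
s1, s2, rho = 1.5, 0.8, 0.6

cov = np.array([[s1**2, rho * s1 * s2],
                [rho * s1 * s2, s2**2]])
joint = multivariate_normal(mean=[mu1, mu2], cov=cov)

x2 = 0.5  # condition on X2 = x2
# Textbook result: X1 | X2 = x2 ~ N(mu1 + rho*(s1/s2)*(x2 - mu2), s1^2*(1 - rho^2))
cond_mean = mu1 + rho * (s1 / s2) * (x2 - mu2)
cond_var = s1**2 * (1 - rho**2)

# f(x1 | x2) = f(x1, x2) / f(x2) should match the N(cond_mean, cond_var) density
x1 = np.linspace(-3.0, 5.0, 50)
lhs = joint.pdf(np.column_stack([x1, np.full_like(x1, x2)])) / norm.pdf(x2, mu2, s2)
rhs = norm.pdf(x1, cond_mean, np.sqrt(cond_var))
print(np.max(np.abs(lhs - rhs)))  # agreement up to floating-point error
```

The check works for any parameter values with |ρ| < 1; the ratio of densities is computed pointwise on a grid and compared to the closed-form conditional density.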
Let X1, …, Xn be independent, where Xi is normally distributed with unknown mean µ and unknown variance σ² > 0. Find the likelihood ratio test for testing µ = 0 against −∞ < µ < ∞.
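The standard reduction here is that the likelihood ratio Λ = (σ̂²/σ̂0²)^(n/2), with σ̂0² the restricted (µ = 0) variance MLE, is a monotone function of the one-sample t statistic: Λ = (1 + t²/(n−1))^(−n/2). A quick numerical check of that identity on an arbitrary simulated sample:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=0.3, scale=1.2, size=25)  # arbitrary sample for illustration
n = len(x)
xbar = x.mean()

# Restricted MLE of the variance (mu = 0) vs. unrestricted MLE
s0 = np.mean(x**2)            # sigma_hat_0^2 under H0: mu = 0
s1 = np.mean((x - xbar)**2)   # sigma_hat^2 under the full model

# Likelihood ratio: the normal likelihoods collapse to (s1/s0)^(n/2)
lam = (s1 / s0) ** (n / 2)

# Identity: Lambda = (1 + t^2/(n-1))^(-n/2), t the one-sample t statistic
s = np.sqrt(np.sum((x - xbar)**2) / (n - 1))
t = xbar / (s / np.sqrt(n))
lam_from_t = (1 + t**2 / (n - 1)) ** (-n / 2)
print(lam, lam_from_t)  # the two expressions agree
```

The identity follows from s0 = s1 + x̄² and t²/(n−1) = x̄²/s1, so rejecting for small Λ is the same as rejecting for large |t|.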
Let X1 and X2 be independent standard normal variables, X1 ∼ N(0, 1) and X2 ∼ N(0, 1).
1) Let Y1 = X1² + X2² and Y2 = X1² − X2². Find the joint p.d.f. of Y1 and Y2, and the marginal p.d.f. of Y1. Are Y1 and Y2 independent?
2) Let W = X1X2/√(X1² + X2²). Find the p.d.f. of W.
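The intended answers can be spot-checked by Monte Carlo before doing the change-of-variables work: Y1 = X1² + X2² should be chi-square with 2 degrees of freedom (mean 2, variance 4), and W = X1X2/√(X1² + X2²) matches the widely quoted result of a N(0, 1/4) distribution. A sketch with a fixed seed:

```python
import numpy as np

rng = np.random.default_rng(42)
m = 200_000
x1 = rng.standard_normal(m)
x2 = rng.standard_normal(m)

y1 = x1**2 + x2**2                       # candidate chi-square(2): mean ~2, var ~4
w = x1 * x2 / np.sqrt(x1**2 + x2**2)     # candidate N(0, 1/4): mean ~0, sd ~0.5

print(y1.mean(), y1.var())
print(w.mean(), w.std())
```

Note the simulation cannot settle the independence question in part 1) by itself: Y1 and Y2 are uncorrelated by symmetry, yet |Y2| ≤ Y1 always holds, which is exactly the kind of dependence the joint p.d.f. makes visible.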
Suppose that X1, X2, X3, X4 are independent random variables with common mean E(Xi) = µ and variance Var(Xi) = σ². Let V = X2 − X3 + X4 and W = X1 − 2X2 + X3 + 4X4.
(a) Find E(V) and E(W).
(b) Find Var(V) and Var(W).
(c) Find Cov(V, W).
(d) Find the correlation coefficient ρ(V, W). Are V and W independent?
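Since V and W are linear combinations of independent variables, every quantity asked for reduces to coefficient-vector arithmetic: E = µ·Σci, Var = σ²·Σci², and Cov = σ²·(c·d). A small sketch that carries out that arithmetic:

```python
import numpy as np

# Coefficient vectors over (X1, X2, X3, X4)
c = np.array([0, 1, -1, 1])    # V = X2 - X3 + X4
d = np.array([1, -2, 1, 4])    # W = X1 - 2*X2 + X3 + 4*X4

# With E(Xi) = mu, Var(Xi) = sigma^2, and independence:
print("E(V) =", c.sum(), "* mu;   E(W) =", d.sum(), "* mu")
print("Var(V) =", (c**2).sum(), "* sigma^2;   Var(W) =", (d**2).sum(), "* sigma^2")
print("Cov(V, W) =", c @ d, "* sigma^2")

# The sigma^2 factors cancel in the correlation coefficient
rho = (c @ d) / np.sqrt((c**2).sum() * (d**2).sum())
print("rho(V, W) =", rho)  # 1/sqrt(3*22) = 1/sqrt(66)
```

A nonzero correlation already answers the independence question in (d): correlated variables cannot be independent.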
Let X1, X2, . . . , Xn iid ∼ N(µ, σ²).
Consider the hypotheses H0 : µ = µ0 and H1 : µ ≠ µ0, and the test statistic (X̄ − µ0)/(S/√n). Note that S has been used because σ is unknown.
a. What is the distribution of the test statistic when H0 is true?
b. What is the type I error of an α-level test of this type? Prove it.
c. What is...
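Parts a and b can be checked by simulation: under H0 the statistic follows a t distribution with n − 1 degrees of freedom, so the two-sided test that rejects when |T| exceeds the t critical value should reject with probability α. A sketch with arbitrary n, µ0, and σ:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, mu0, alpha = 10, 0.0, 0.05
crit = stats.t.ppf(1 - alpha / 2, df=n - 1)  # two-sided t critical value

# Simulate many samples under H0 (true mean mu0; sigma chosen arbitrarily)
reps = 100_000
x = rng.normal(mu0, 2.0, size=(reps, n))
xbar = x.mean(axis=1)
s = x.std(axis=1, ddof=1)
tstat = (xbar - mu0) / (s / np.sqrt(n))

# Estimated type I error rate: should be close to alpha
print(np.mean(np.abs(tstat) > crit))
```

The estimated rejection rate is invariant to the σ used in the simulation, which is the whole point of studentizing with S.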
Q5. Let X1, X2, ···, Xn be an independent random sample from a distribution with finite mean µ and finite variance σ². An estimator of µ of the form L = c1X1 + c2X2 + ··· + cnXn is called a linear estimator, where c1, c2, ···, cn are some known constants. If L is unbiased, then it is called a linear unbiased estimator. A linear unbiased estimator that has the minimum variance among all linear unbiased estimators is called the best linear unbiased estimator (BLUE).
(i) Express E(L) and Var(L) in terms...
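Since E(L) = µ·Σci and Var(L) = σ²·Σci², unbiasedness forces Σci = 1, and the BLUE minimizes Σci² subject to that constraint; by Cauchy–Schwarz the minimum is 1/n, attained at ci = 1/n (the sample mean). A sketch that probes this numerically with random unbiased weight vectors:

```python
import numpy as np

# Under unbiasedness sum(c) = 1, Var(L) = sigma^2 * sum(c^2).
# Cauchy-Schwarz: sum(c^2) >= 1/n, with equality at c_i = 1/n (sample mean).
n = 5
best = 1.0 / n  # sum(c^2) for equal weights

rng = np.random.default_rng(3)
for _ in range(1000):
    c = rng.normal(size=n)
    c = c / c.sum()                       # force unbiasedness: sum(c) = 1
    assert (c**2).sum() >= best - 1e-12   # no unbiased weights beat equal weights
print("equal weights attained the smallest sum(c^2) among all sampled candidates")
```

The random search is only illustrative; the Cauchy–Schwarz (or Lagrange multiplier) argument is what the problem's later parts presumably ask for.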
Let X1, X2, . . . be a sequence of independent and identically distributed random variables, where the distribution is the so-called zero-truncated Poisson distribution with probability mass function P(X = x) = λ^x/(x!(e^λ − 1)), x = 1, 2, 3, ...
Let N ∼ Binomial(n, 1 − e^(−λ)) be another random variable that is independent of the Xi's.
1) Show that Y = X1 + X2 + ... + XN has a Poisson distribution with mean nλ.
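The claim can be spot-checked by simulation before proving it: a Poisson(nλ) variable has mean and variance both equal to nλ. The sketch below draws zero-truncated Poissons by rejection sampling (the helper `rtrunc_pois` is an illustrative construction, adequate for moderate λ):

```python
import numpy as np

rng = np.random.default_rng(7)
n, lam = 6, 1.3
reps = 20_000

def rtrunc_pois(size):
    # Zero-truncated Poisson(lam) by rejection: draw Poissons, discard zeros
    out = np.empty(size, dtype=int)
    filled = 0
    while filled < size:
        draw = rng.poisson(lam, size=size)
        draw = draw[draw > 0]
        take = min(len(draw), size - filled)
        out[filled:filled + take] = draw[:take]
        filled += take
    return out

N = rng.binomial(n, 1 - np.exp(-lam), size=reps)  # number of terms per replicate
xs = rtrunc_pois(N.sum())

# Y[i] = sum of the first N[i] truncated-Poisson draws (empty sum = 0)
y = np.zeros(reps)
idx = 0
for i in range(reps):
    y[i] = xs[idx:idx + N[i]].sum()
    idx += N[i]

print(y.mean(), y.var())  # both should be close to n*lam
```

Matching mean and variance is of course only consistent with, not proof of, Y ∼ Poisson(nλ); the proof goes through the probability generating function or by decomposing a sum of n iid Poisson(λ) variables into its nonzero terms.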