Question


The observations are Y1, . . . , Yn. The model is Yi = βxi +...

The observations are Y1, . . . , Yn. The model is Yi = βxi + εi, i = 1, . . . , n, where (i) x1, . . . , xn are known constants, and (ii) ε1, . . . , εn are iid N(0, σ²). Find the MLEs of β and σ². Are they jointly sufficient for β and σ²?

Solutions

Expert Solution
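
An outline of the standard maximum-likelihood argument for this model (a sketch of the usual approach, not a full worked solution): the joint density of Y1, . . . , Yn is

L(\beta, \sigma^2) = (2\pi\sigma^2)^{-n/2} \exp\!\Big(-\frac{1}{2\sigma^2}\sum_{i=1}^{n}(y_i - \beta x_i)^2\Big).

Differentiating the log-likelihood with respect to \beta and \sigma^2 and setting the derivatives to zero gives

\hat\beta = \frac{\sum_{i=1}^{n} x_i Y_i}{\sum_{i=1}^{n} x_i^2}, \qquad \hat\sigma^2 = \frac{1}{n}\sum_{i=1}^{n}\big(Y_i - \hat\beta x_i\big)^2.

Expanding the exponent as \sum y_i^2 - 2\beta\sum x_i y_i + \beta^2\sum x_i^2 shows the likelihood depends on the data only through \big(\sum x_i Y_i, \sum Y_i^2\big). Since the x_i are known constants, (\hat\beta, \hat\sigma^2) is a one-to-one function of that pair, so by the factorization theorem \hat\beta and \hat\sigma^2 are jointly sufficient for (\beta, \sigma^2).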


Related Solutions

Let Y1,Y2,...,Yn be a Bernoulli distributed random sample with P(Yi = 1) = p and P(Yi...
Let Y1, Y2, . . . , Yn be a Bernoulli distributed random sample with P(Yi = 1) = p and P(Yi = 0) = 1 − p for all i. (a) Prove that E(Ȳ) = p and V(Ȳ) = p(1 − p)/n for the sample mean Ȳ of Y1, Y2, . . . , Yn, and find a statistic U and show that it is sufficient for p. (b) Find the MVUE for p and show that it is unbiased for p.
Suppose that the random variables Y1, . . . , Yn satisfy Yi = βxi + εi, i = 1, . . . , n, where the xi are fixed constants and the εi are...
Suppose that the random variables Y1, . . . , Yn satisfy Yi = βxi + εi, i = 1, . . . , n, where the xi are fixed constants and the εi are iid random variables following a normal distribution with mean zero and variance σ². The estimator β̂ = (Σ xiYi)/(Σ xi²), with the sums over i = 1, . . . , n, is unbiased for β, and its variance is Var(β̂) = σ²/(Σ xi²). What is the distribution of this estimator?
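A short note on the question as reconstructed above (assuming it asks for the distribution of \hat\beta): \hat\beta = \sum_i x_i Y_i / \sum_i x_i^2 is a linear combination of independent normal variables, so \hat\beta \sim N\big(\beta,\ \sigma^2 / \sum_i x_i^2\big).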
1. Let X1, . . . , Xn, Y1, . . . , Yn be mutually independent random variables, and Z = (1/n) Σ_{i=1}^{n} XiYi. Suppose for each...
1. Let X1, . . . , Xn, Y1, . . . , Yn be mutually independent random variables, and Z = (1/n) Σ_{i=1}^{n} XiYi. Suppose for each i ∈ {1, . . . , n}, Xi ∼ Bernoulli(p), Yi ∼ Binomial(n, p). What is Var[Z]? 2. There is a fair coin and a biased coin that flips heads with probability 1/4. You randomly pick one of the coins and flip it until you get a...
If you conduct an experiment 1500 times independently, i = 1, 2, 3, . . . , 1500. Let y1, y2, . . . , yN be i.i.d. observations...
If you conduct an experiment 1500 times independently, i = 1, 2, 3, . . . , 1500. Let y1, y2, . . . , yN be i.i.d. observations from this experiment: yi = 1 if heads, with probability β; yi = 0 if tails, with probability 1 − β. If you get 600 heads and 900 tails, what is the MLE of β? A. 0.4 B. 0.5 C. 0.6 D. 0.7
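As a quick check of the arithmetic: the Bernoulli likelihood is maximized at the sample proportion, so with 600 heads in 1500 trials, \hat\beta_{MLE} = 600/1500 = 0.4 (choice A).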
Let Y1, . . . , Yn be a sample from the Uniform density on [0, 2θ]. Show that θ̂ = max(Y1, . . .
Let Y1, . . . , Yn be a sample from the Uniform density on [0, 2θ]. Show that θ̂ = max(Y1, . . . , Yn) is a sufficient statistic for θ. Find an MVUE (minimum variance unbiased estimator) for θ.
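For context, a standard route for this setup (sketched, assuming the Uniform[0, 2θ] model as stated): Y_{(n)} = \max(Y_1, \ldots, Y_n) is sufficient and complete by factorization, E[Y_{(n)}] = \frac{n}{n+1}\,2\theta, so \frac{n+1}{2n}\,Y_{(n)} is unbiased and, being a function of the complete sufficient statistic, is the MVUE by Lehmann–Scheffé.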
True or False (a) For any distribution, the sample data, Y1, . . . Yn, is...
True or False (a) For any distribution, the sample data, Y1, . . . Yn, is always a sufficient statistic. (b) Biased estimators are always preferred to unbiased estimators. (c) Maximum likelihood estimators are always unbiased.
Let Y1, Y2, . . ., Yn be a random sample from a uniform distribution on...
Let Y1, Y2, . . ., Yn be a random sample from a uniform distribution on the interval (θ - 2, θ). a) Show that Ȳ is a biased estimator of θ. Calculate the bias. b) Calculate MSE( Ȳ). c) Find an unbiased estimator of θ. d) What is the mean square error of your unbiased estimator? e) Is your unbiased estimator a consistent estimator of θ?
Suppose that X1, X2, . . . , Xm and Y1, Y2, . . . , Yn are independent random samples, with...
Suppose that X1, X2, . . . , Xm and Y1, Y2, . . . , Yn are independent random samples, with the variables Xi normally distributed with mean μ1 and variance σ1² and the variables Yi normally distributed with mean μ2 and variance σ2². The difference between the sample means, X̄ − Ȳ, is then a linear combination of m + n normally distributed random variables and, by this theorem, is itself normally distributed. (a) Find E(X̄ − Ȳ). (b) Find V(X̄ − Ȳ). (c)...
Suppose Y1, . . . , Yn are independent random variables with common density fY(y) =...
Suppose Y1, . . . , Yn are independent random variables with common density fY(y) = e^(μ − y) for y > μ. Derive a 95% confidence interval for μ. Find the MLE for μ to use as the estimator.
Assume Y1, Y2, . . . , Yn are i.i.d. Normal(μ, σ²) where σ² is...
Assume Y1, Y2, . . . , Yn are i.i.d. Normal(μ, σ²) where σ² is known and fixed and the unknown mean μ has a Normal(0, σ²/m) prior, where m is a given constant. Give a 95% credible interval for μ.