Question

In: Statistics and Probability

Suppose we have three sets of random variables Wh, Xi, and Yj (for h = 1,...,k, i = 1,...,m, and j = 1,...,n), all of which are mutually independent. Assume that the three sets of random variables are all normally distributed with different means but the same standard deviation. The MLEs for the means are just the group means, and the MLE for the variance is the mean of the squared deviations of the observations from their respective group means. Write a function to fit this model to three observed data vectors w, x, y and return both the MLE and the log-likelihood evaluated at the MLE. Use the commands

data("iris")

w = iris$Sepal.Width[iris$Species=="setosa"]

x = iris$Sepal.Width[iris$Species=="versicolor"]

y = iris$Sepal.Width[iris$Species=="virginica"]

to make some data to analyze using your function. Compare the results from analyzing the data with the model for different means to the results from analyzing the data under the assumption that the means are all the same. Comment on your results.
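As a sketch of the fit the exercise asks for (the original wants an R function; this is an illustrative Python translation, with made-up function names and small hypothetical data vectors standing in for the iris sepal widths):

```python
import math

def fit_grouped_normal(w, x, y):
    """MLE for three normal groups with different means but a common variance.

    Returns the three group means, the pooled variance MLE, and the
    log-likelihood evaluated at the MLE.
    """
    groups = [w, x, y]
    n = sum(len(g) for g in groups)
    means = [sum(g) / len(g) for g in groups]          # MLEs for the means
    # MLE for the variance: mean squared deviation of each observation
    # from its own group mean
    sigma2 = sum((v - m) ** 2 for g, m in zip(groups, means) for v in g) / n
    # At the MLE the residual sum of squares equals n * sigma2, so the
    # normal log-likelihood simplifies to -(n/2) * (log(2*pi*sigma2) + 1)
    loglik = -0.5 * n * (math.log(2 * math.pi * sigma2) + 1)
    return means, sigma2, loglik

def fit_common_normal(w, x, y):
    """MLE when all three groups are assumed to share a single mean."""
    data = list(w) + list(x) + list(y)
    n = len(data)
    mean = sum(data) / n
    sigma2 = sum((v - mean) ** 2 for v in data) / n
    loglik = -0.5 * n * (math.log(2 * math.pi * sigma2) + 1)
    return mean, sigma2, loglik
```

Because the common-mean model is nested inside the different-means model, its maximized log-likelihood can never exceed the latter's; with the iris sepal widths one would expect a clearly higher log-likelihood for the different-means fit, since the three species have different mean sepal widths.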

Solutions

Expert Solution

A statistic is a property of a sample, whereas a parameter is a property of a population. Often it is natural to estimate a parameter θ (such as the population mean µ) by the corresponding property of the sample (here the sample mean X̄). Note that θ may be a vector or a more complicated object.

Unobserved quantities are treated mathematically as random variables. Potentially observable quantities are usually denoted by capital letters (Xi, X̄, Y, etc.). Once the data have been observed, the values taken by these random variables are known (Xi = xi, X̄ = x̄, etc.). Unobservable or hypothetical quantities are usually denoted by Greek letters (θ, µ, σ², etc.), and estimators are often denoted by putting a hat on the corresponding symbol (θ̂, µ̂, σ̂², etc.). Nearly all statistics books use this style of notation, so it is adopted in these notes.

However, sometimes I shall wish to distinguish carefully between knowns and unknowns, and shall denote all unknowns by capitals. Thus Θ represents an unknown parameter vector, and θ represents a particular assumed value of Θ. This is especially useful when considering probability distributions for parameters; one can then write fΘ(θ) and Pr(Θ = θ) by exact analogy with fX(x) and Pr(X = x). The set of possible values for an RV X is called its sample space ΩX. Similarly, the parameter space ΩΘ is the set of possible values for the parameter Θ.

Fix the size of the test to be α. Let A be a positive constant and C0 a subset of the sample space satisfying

1. Pr(X ∈ C0 | θ = θ0) = α,
2. X ∈ C0 ⇐⇒ L(θ0; x)/L(θ1; x) = f(x|θ0)/f(x|θ1) ≤ A.
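The construction above is the Neyman–Pearson likelihood-ratio test: reject H0: θ = θ0 exactly when the likelihood ratio f(x|θ0)/f(x|θ1) is at most the cutoff A. A minimal sketch for a single N(θ, 1) observation (with hypothetical values θ0 = 0, θ1 = 1) shows the ratio is monotone decreasing in x, so the rejection region is of the form {x large}:

```python
import math

def normal_pdf(x, mu, sigma2=1.0):
    # Density of N(mu, sigma2) evaluated at x
    return math.exp(-(x - mu) ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

def likelihood_ratio(x, theta0=0.0, theta1=1.0):
    # L(theta0; x) / L(theta1; x) for a single observation x;
    # for theta0 = 0, theta1 = 1 this equals exp(0.5 - x), decreasing in x
    return normal_pdf(x, theta0) / normal_pdf(x, theta1)
```

Small ratio values then correspond to large observed x, which is why the most powerful test of θ0 = 0 against θ1 = 1 rejects for large x.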

(1) To every bounded Borel set B of R there corresponds a random variable X(B) with E|X(B)|² < ∞. (2) If B₁, B₂, ... are disjoint Borel sets whose union B is bounded, then X(B) = X(B₁) + X(B₂) + ···.

The random measures considered in this paper are assumed to be real and to satisfy EX(B) = 0 for every Borel set B. A random measure has independent components if for every collection of disjoint Borel sets B₁, ..., Bₙ, the random variables X(B₁), ..., X(Bₙ) are mutually independent. If X has independent components, the set function V defined for every bounded Borel set B by V(B) = E|X(B)|² is a Borel measure. A random measure has stationary components if, for every collection of bounded Borel sets B₁, ..., Bₙ, the joint distribution of the family X(τ + B₁), ..., X(τ + Bₙ) is independent of τ. For random measures with independent components, stationarity is equivalent to requiring that X(B) and X(τ + B) be identically distributed for every B and every τ. In the stationary case V is a Haar measure and is equal to Lebesgue measure on the Borel sets to within a nonnegative multiplicative constant. The points t of R for which E|X({t})|² > 0 are called singular.


Related Solutions

You have three independent uniform random variables Xi on [0,1] for i=1,2,3. calculate (a) P(all of...
You have three independent uniform random variables Xi on [0,1] for i = 1, 2, 3. Calculate (a) P(all of them are less than 1/2), (b) P(at least one of them is less than 1/2), (c) the conditional probability P(all of them are less than 1/2 | at least one of them is less than 1/2), (d) the mean and the variance of S = X1 + X2 + X3, (e) P(the value of X2 lies between the values of the other two random...
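For parts (a)–(d) of the uniform question above, the standard independence rules give closed forms: P(all three < 1/2) = (1/2)³, P(at least one < 1/2) = 1 − (1/2)³, the conditional probability is their ratio, and the mean and variance of S add across the three variables. A quick exact check:

```python
from fractions import Fraction

# Exact values for three independent Uniform(0,1) variables
p_each = Fraction(1, 2)                  # P(Xi < 1/2) for each i
p_all = p_each ** 3                      # independence: P(all three < 1/2)
p_at_least_one = 1 - (1 - p_each) ** 3   # complement of "none < 1/2"
p_cond = p_all / p_at_least_one          # P(all < 1/2 | at least one < 1/2)
mean_S = 3 * Fraction(1, 2)              # E[S] = 3 * E[Xi]
var_S = 3 * Fraction(1, 12)              # Var[S] = 3 * Var[Xi], by independence
```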
Investigate the following theorems (h) For sets A, B and C we have i. A\(B ∪...
Investigate the following theorems (h) For sets A, B and C we have i. A\(B ∪ C) = (A\B) ∩ (A\C), ii. A\(B ∩ C) = (A\B) ∪ (A\C), iii. A ≠ B if and only if (A\B) ∪ (B\A) ≠ ∅, iv. A ∪ B ⊆ C if and only if A ⊆ C and B ⊆ C. What happens in the extreme case(s) where some (or all) sets are empty?
1. We have the data as follows. There are three independent variables and three dependent variables...
1. We have the following data on an independent variable x and a dependent variable y, with three observations of each (you may use the table below to solve this problem):

x: 3, 5, 7 (Total 15)
y: 11, 6, 4 (Total 21)

a) Calculate b1 and b0, and write the equation of the least squares line. b) Determine the values of SSE and SST. c) Calculate the standard error. d) Find the rejection point for the t statistic at α = .05 and test H0:...
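Reading the table above as three (x, y) pairs, the least-squares quantities in parts a)–c) follow directly from the usual formulas; a short sketch:

```python
import math

x = [3, 5, 7]
y = [11, 6, 4]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n

# Slope b1 = Sxy / Sxx, intercept b0 = ybar - b1 * xbar
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
     / sum((xi - xbar) ** 2 for xi in x)
b0 = ybar - b1 * xbar

# Error and total sums of squares, and the standard error s = sqrt(SSE/(n-2))
yhat = [b0 + b1 * xi for xi in x]
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
sst = sum((yi - ybar) ** 2 for yi in y)
s = math.sqrt(sse / (n - 2))
```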
Suppose Xi and Yi are all independent (i=1,2,3), where the three Xi are iid and follow...
Suppose Xi and Yi are all independent (i=1,2,3), where the three Xi are iid and follow an Exponential distribution with rate r, while the three Yi are also iid but follow a Normal(μ, σ2) distribution. (a) Write down the joint pdf for the random vector (X1,X2,X3,Y1,Y2,Y3). (b) Find the expected value of the product X1 Y1, i.e., E(X1 Y1), and find Cov(X2, X3).
Suppose that the random variables Y1,...,Yn satisfy Yi = βxi + εi, i=1,...,n, where the set...
Suppose that the random variables Y1,...,Yn satisfy Yi = βxi + εi, i = 1,...,n, where the xi are fixed constants and the εi are iid random variables following a normal distribution with mean zero and variance σ². The estimator β̂ = (Σi=1,...,n Yi xi) / (Σi=1,...,n xi²) is an unbiased estimator for β, and its variance is Var(β̂) = σ² / Σi=1,...,n xi². What is the distribution of this estimator?
1.) Suppose we have the following values for a dependent variable, Y, and three independent variables,...
1.) Suppose we have the following values for a dependent variable, Y, and three independent variables, X1, X2, and X3. The variable X3 is a dummy variable where 1 = male and 2 = female:

X1 X2 X3 Y
0 40 1 30
0 50 0 10
2 20 0 40
2 50 1 50
4 90 0 60
4 60 0 70
4 70 1 80
4 40 1 90
6 40 0 70
6 50 1 90
8 ...
Suppose that X1,X2,X3,X4 are independent random variables with common mean E(Xi) =μ and variance Var(Xi) =σ2....
Suppose that X1, X2, X3, X4 are independent random variables with common mean E(Xi) = μ and variance Var(Xi) = σ². Let V = X2 − X3 + X4 and W = X1 − 2X2 + X3 + 4X4. (a) Find E(V) and E(W). (b) Find Var(V) and Var(W). (c) Find Cov(V,W). (d) Find the correlation coefficient ρ(V,W). Are V and W independent?
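The moments asked for above follow from the rules for linear combinations of independent variables: the means and variances are determined by the coefficient vectors, and Cov(V, W) = σ² Σ aᵢbᵢ. A small sketch (with hypothetical values μ = σ² = 1):

```python
import math

# Coefficient vectors of V and W in X1..X4 (each Xi has mean mu, variance sigma2)
a = [0, 1, -1, 1]    # V = X2 - X3 + X4
b = [1, -2, 1, 4]    # W = X1 - 2*X2 + X3 + 4*X4

def stats(mu=1.0, sigma2=1.0):
    EV = sum(a) * mu                              # E[V] = mu * sum(a_i)
    EW = sum(b) * mu                              # E[W] = mu * sum(b_i)
    varV = sum(ai ** 2 for ai in a) * sigma2      # independence: variances add
    varW = sum(bi ** 2 for bi in b) * sigma2
    cov = sum(ai * bi for ai, bi in zip(a, b)) * sigma2
    rho = cov / math.sqrt(varV * varW)
    return EV, EW, varV, varW, cov, rho
```

Since the covariance is a nonzero multiple of σ², V and W are correlated and hence not independent.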
Let X1,...,Xn be independent random variables,and let X=X1+...+Xn be their sum. 1. Suppose that each Xi...
Let X1,...,Xn be independent random variables, and let X = X1 + ... + Xn be their sum. 1. Suppose that each Xi is geometric with respective parameter pi. It is known that the mean of X is equal to μ, where μ > 0. Show that the variance of X is minimized if the pi's are all equal to n/μ. 2. Suppose that each Xi is Bernoulli with respective parameter pi. It is known that the mean of X is equal to μ, where μ >...
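For part 1 above, a quick numerical check (assuming the number-of-trials convention, where a Geometric(p) variable has mean 1/p and variance (1−p)/p²) illustrates the claim: with n = 2 and μ = 10, the equal choice pi = n/μ = 0.2 gives a smaller total variance than an unequal choice with the same total mean.

```python
def total_mean(ps):
    # Mean of a sum of independent Geometric(p_i) variables
    # (number-of-trials convention: each has mean 1/p_i)
    return sum(1 / p for p in ps)

def total_var(ps):
    # Variance of the sum: each term contributes (1 - p_i) / p_i**2
    return sum((1 - p) / p ** 2 for p in ps)

equal = [0.2, 0.2]        # p_i = n / mu with n = 2, mu = 10
unequal = [0.25, 1 / 6]   # different p_i, chosen so the mean is still 10
```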