Question

In: Math

Make a simulation to verify the following theorem: if X1 ∼ N(µ1, σ1²), X2...

Make a simulation to verify the following theorem: if X1 ∼ N(µ1, σ1²), X2 ∼ N(µ2, σ2²), and X1 and X2 are independent, then X1 + X2 ∼ N(µ1 + µ2, σ1² + σ2²).

Solutions

Expert Solution

The simulation below is done in R; you can use any tool to do the same.
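Before running any code, it helps to note what the theorem predicts for the specific numbers used in the simulation below (µ1 = 10, µ2 = 20, σ1 = 3, σ2 = 4 are just values chosen for illustration, not part of the theorem). A short derivation, using only linearity of expectation and independence:

\begin{aligned}
\mathbb{E}[X_1 + X_2] &= \mu_1 + \mu_2 = 10 + 20 = 30,\\
\operatorname{Var}(X_1 + X_2) &= \operatorname{Var}(X_1) + \operatorname{Var}(X_2) + 2\,\operatorname{Cov}(X_1, X_2) = \sigma_1^2 + \sigma_2^2 + 0 = 3^2 + 4^2 = 25.
\end{aligned}

So the simulated sums should have sample mean close to 30 and sample variance close to 25 (standard deviation 5); the covariance term drops out precisely because X1 and X2 are independent.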

The R code below can be copied and run directly in R.

#Code Start

#Simulate 10,000 independent draws from X1 ~ N(10, 3^2) and X2 ~ N(20, 4^2)
x1 <- rnorm(n = 10000, mean = 10, sd = 3)
x2 <- rnorm(n = 10000, mean = 20, sd = 4)
ans <- x1 + x2

#By the theorem, the sum should have mean mu1 + mu2 = 10 + 20 = 30
#and variance sd1^2 + sd2^2 = 3^2 + 4^2 = 25
#Let's verify with the sample mean and sample variance
mean1 <- mean(ans)
var1 <- var(ans)
mean1
var1

#Code end
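Matching the sample mean and variance only checks the first two moments. As an optional extra (this is an addition beyond the original answer; it uses only base R and assumes the vector ans from the code above is still in the workspace), the sketch below overlays the theoretical N(30, 5²) density on a histogram of the simulated sums, runs a Kolmogorov-Smirnov test against N(30, 5), and draws a normal Q-Q plot.

#Extra check start (optional; assumes ans from the code above)

#Histogram of the simulated sums with the theoretical N(30, 5) density overlaid
hist(ans, breaks = 50, freq = FALSE,
     main = "Simulated X1 + X2 vs. N(30, 5^2)", xlab = "x1 + x2")
curve(dnorm(x, mean = 30, sd = 5), add = TRUE, lwd = 2)

#Kolmogorov-Smirnov test against N(30, 5); a large p-value is consistent
#with the simulated sum being normally distributed
ks.test(ans, "pnorm", mean = 30, sd = 5)

#Normal Q-Q plot as a further visual check
qqnorm(ans)
qqline(ans)

#Extra check end

Note that the K-S test here is only a rough check, since the null parameters are fixed at their theoretical values (30 and 5) rather than estimated from the sample.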

Hope the above answer has helped you in understanding the problem. Please upvote the answer if it has really helped you. Good luck!


Related Solutions

Consider the bivariate random vector x = (X1, X2)′ ∼ N2((µ1, µ2)′, Σ), where Σ has entries σ1²,...
Consider the bivariate random vector x = (X1, X2)′ ∼ N2((µ1, µ2)′, Σ), where Σ is the 2×2 matrix with diagonal entries σ1², σ2² and off-diagonal entries ρσ1σ2. 1. Expand the matrix form of the density function to get the usual bivariate normal density involving σ1, σ2, ρ and exponential terms in (x1 − µ1)², (x1 − µ1)(x2 − µ2) and (x2 − µ2)². 2. Explain what happens in the following scenarios: (a) ρ = 0 (b) ρ = 1 (c) ρ = −1
Let X1, X2, · · · , Xn (n ≥ 30) be i.i.d observations from N(µ1,...
Let X1, X2, · · · , Xn (n ≥ 30) be i.i.d. observations from N(µ1, σ1²) and Y1, Y2, · · · , Yn be i.i.d. observations from N(µ2, σ2²). Also assume that the X's and Y's are independent. Suppose that µ1, µ2, σ1², σ2² are unknown. Find an approximate 95% confidence interval for µ1 − µ2.
Sample 1 has n1 independent random variables (X1, X2, ..., Xn1) following a normal distribution, N(µ1, σ1²); and...
Sample 1 has n1 independent random variables (X1, X2, ..., Xn1) following a normal distribution, N(µ1, σ1²); and Sample 2 has n2 independent random variables (Y1, Y2, ..., Yn2) following a normal distribution, N(µ2, σ2²). Suppose the sample 1 mean is X̄, the sample 2 mean is Ȳ, and we know n1, n2, σ1² and σ2². Construct a Z statistic (i.e. Z ∼ N(0, 1)) to test H0 : µ1 = µ2.
Let (X1, X2) have a bivariate normal distribution with mean vector (µ1, µ2), variance σ1²...
Let (X1, X2) have a bivariate normal distribution with mean vector (µ1, µ2), variance σ1² for X1 and σ2² for X2, and correlation cor(X1, X2) = ρ. (a) Write down the joint density f(x1, x2). (b) Find the marginal distribution f(x1). (c) Find the conditional distribution f(x1 | x2) and the mean and variance of the conditional distribution. (d) Obtain the likelihood equations and calculate the MLE for µ1, µ2, σ1², σ2², ρ.
Let X1 and X2 be independent standard normal variables X1 ∼ N(0, 1) and X2 ∼...
Let X1 and X2 be independent standard normal variables X1 ∼ N(0, 1) and X2 ∼ N(0, 1). 1) Let Y1 = X1² + X2² and Y2 = X1² − X2². Find the joint p.d.f. of Y1 and Y2, and the marginal p.d.f. of Y1. Are Y1 and Y2 independent? 2) Let W = X1X2/√(X1² + X2²). Find the p.d.f. of W.
(a) Suppose X1, ..., Xn1 i.i.d. ∼ N(µ1, σ1²) and Y1, ..., Yn2 i.i.d. ∼...
(a) Suppose X1, ..., Xn1 i.i.d. ∼ N(µ1, σ1²) and Y1, ..., Yn2 i.i.d. ∼ N(µ2, σ2²) are independent Normal samples. Suggest an unbiased estimator for µ1 − µ2 and find its standard error. Now suppose n1 = 100, n2 = 200, it is known that σ1² = σ2² = 1, and we calculate X̄ = 5.7, Ȳ = 5.2. Find a 2-standard-error bound on the error of estimation. (b) Suppose X ∼...
Let X1, X2, X3 be independent having N(0,1). Let Y1=(X1-X2)/√2, Y2=(X1+X2-2*X3)/√6, Y3=(X1+X2+X3)/√3. Find the joint pdf...
Let X1, X2, X3 be independent having N(0,1). Let Y1=(X1-X2)/√2, Y2=(X1+X2-2*X3)/√6, Y3=(X1+X2+X3)/√3. Find the joint pdf of Y1, Y2, Y3, and the marginal pdfs.
Let X1 and X2 be uniform on the consecutive integers -n, -(n-1), ... , n-1, n....
Let X1 and X2 be uniform on the consecutive integers -n, -(n-1), ... , n-1, n. Use convolution to find the mass function for X1 + X2.
Let X1 and X2 have the joint pdf f(x1, x2) = 2 for 0 < x1 < x2 < 1, and 0 elsewhere. (a) Find the...
Let X1 and X2 have the joint pdf f(x1, x2) = 2 for 0 < x1 < x2 < 1, and 0 elsewhere. (a) Find the conditional densities (pdfs) of X1 | X2 = x2 and X2 | X1 = x1. (b) Find the conditional expectation and variance of X1 | X2 = x2 and X2 | X1 = x1. (c) Compare the probabilities P(0 < X1 < 1/2 | X2 = 3/4) and P(0 < X1 < 1/2). (d) Suppose that Y = E(X2 | X1). Verify that E(Y) = E(X2), and that var(Y) ≤ var(X2).
Consider the following linear programming problem: Maximize $1 X1 + $2 X2 subject to 2 X1...
Consider the following linear programming problem: Maximize $1 X1 + $2 X2, subject to 2 X1 + X2 ≤ 8 (Constraint A), X1 + X2 ≤ 5 (Constraint B), X1, X2 ≥ 0 (Constraint C). Note: Report two digits after the decimal point. Do NOT use thousands separators (,). 1 - Which of the following is the correct standard maximization form for the above linear programming problem? Z - X1 - 2 X2 =...