Question

In: Math

X1, ..., Xn are independent random variables with Xi ∼ Γ(αi, β) for i = 1, ..., n. Show that S = X1 + ··· + Xn ∼ Γ(α1 + ··· + αn, β).

Solutions

Expert Solution

Note first that the Xi are independent but not identically distributed (their shape parameters differ), so independence is what the argument actually uses. Start with the two-variable case. Let X1 and X2 be independent gamma variables with X1 ∼ Γ(α1, β) and X2 ∼ Γ(α2, β), so that (in the rate parameterization) each has density

$$f_{X_i}(x) = \frac{\beta^{\alpha_i}}{\Gamma(\alpha_i)}\, x^{\alpha_i - 1} e^{-\beta x}, \qquad x > 0, \ i = 1, 2.$$

Since X1 and X2 are independent, the probability density function of S2 = X1 + X2 is given by the convolution of the densities f_{X_1} and f_{X_2}. Thus, for s > 0,

$$f_{S_2}(s) = \int_0^s f_{X_1}(x)\, f_{X_2}(s - x)\, dx = \int_0^s \frac{\beta^{\alpha_1}}{\Gamma(\alpha_1)} x^{\alpha_1 - 1} e^{-\beta x} \cdot \frac{\beta^{\alpha_2}}{\Gamma(\alpha_2)} (s - x)^{\alpha_2 - 1} e^{-\beta (s - x)}\, dx.$$

The above integral can be simplified to:

$$f_{S_2}(s) = \frac{\beta^{\alpha_1 + \alpha_2}}{\Gamma(\alpha_1)\Gamma(\alpha_2)}\, e^{-\beta s} \int_0^s x^{\alpha_1 - 1} (s - x)^{\alpha_2 - 1}\, dx.$$

Now, substituting x = st (so that dx = s\,dt), we get

$$f_{S_2}(s) = \frac{\beta^{\alpha_1 + \alpha_2}}{\Gamma(\alpha_1)\Gamma(\alpha_2)}\, e^{-\beta s}\, s^{\alpha_1 + \alpha_2 - 1} \int_0^1 t^{\alpha_1 - 1} (1 - t)^{\alpha_2 - 1}\, dt = \frac{\beta^{\alpha_1 + \alpha_2}}{\Gamma(\alpha_1)\Gamma(\alpha_2)}\, e^{-\beta s}\, s^{\alpha_1 + \alpha_2 - 1}\, B(\alpha_1, \alpha_2),$$

where B(α1, α2) is the beta function,

$$B(\alpha_1, \alpha_2) = \frac{\Gamma(\alpha_1)\Gamma(\alpha_2)}{\Gamma(\alpha_1 + \alpha_2)}.$$

That is,

$$f_{S_2}(s) = \frac{\beta^{\alpha_1 + \alpha_2}}{\Gamma(\alpha_1 + \alpha_2)}\, s^{\alpha_1 + \alpha_2 - 1} e^{-\beta s}, \qquad s > 0,$$

which is the Γ(α1 + α2, β) density. Thus, the additive property is true for two independent gamma variables:

$$X_1 + X_2 \sim \Gamma(\alpha_1 + \alpha_2, \beta).$$
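The two-variable case can be checked numerically: the convolution integral above, evaluated by quadrature, should agree with the Γ(α1 + α2, β) density at every point. A minimal sanity-check sketch (the parameter values are arbitrary; note that scipy's `gamma` takes `scale = 1/β` under the rate parameterization used here):

```python
# Numerical check: the convolution of two gamma densities with a common
# rate beta matches the Gamma(a1 + a2, beta) density.
from scipy import integrate
from scipy.stats import gamma

a1, a2, beta = 2.5, 1.7, 3.0   # arbitrary shapes, common rate

def conv_pdf(s):
    # f_S(s) = integral_0^s f_{X1}(x) * f_{X2}(s - x) dx
    integrand = lambda x: (gamma.pdf(x, a1, scale=1.0 / beta)
                           * gamma.pdf(s - x, a2, scale=1.0 / beta))
    val, _ = integrate.quad(integrand, 0.0, s)
    return val

for s in (0.5, 1.0, 2.0):
    direct = gamma.pdf(s, a1 + a2, scale=1.0 / beta)
    assert abs(conv_pdf(s) - direct) < 1e-6
print("convolution matches the Gamma(a1 + a2, beta) density")
```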

Now proceed by induction on n. Suppose the property is true for n − 1 variables, i.e.

$$S_{n-1} = X_1 + \cdots + X_{n-1} \sim \Gamma(A, \beta), \qquad A = \alpha_1 + \cdots + \alpha_{n-1},$$

where X1, ..., Xn−1 are independent gamma variables with Xi ∼ Γ(αi, β).

Then S_{n−1} is a function of X1, ..., Xn−1 only, so S_{n−1} and Xn are independent, with S_{n−1} ∼ Γ(A, β) and Xn ∼ Γ(αn, β). Thus, for s > 0,

$$f_{S_n}(s) = \int_0^s f_{S_{n-1}}(x)\, f_{X_n}(s - x)\, dx = \frac{\beta^{A + \alpha_n}}{\Gamma(A)\Gamma(\alpha_n)}\, e^{-\beta s} \int_0^s x^{A - 1} (s - x)^{\alpha_n - 1}\, dx.$$

Now, substituting x = st as before, we get

$$f_{S_n}(s) = \frac{\beta^{A + \alpha_n}}{\Gamma(A)\Gamma(\alpha_n)}\, e^{-\beta s}\, s^{A + \alpha_n - 1}\, B(A, \alpha_n),$$

where

$$B(A, \alpha_n) = \frac{\Gamma(A)\Gamma(\alpha_n)}{\Gamma(A + \alpha_n)}.$$

Thus,

$$f_{S_n}(s) = \frac{\beta^{A + \alpha_n}}{\Gamma(A + \alpha_n)}\, s^{A + \alpha_n - 1} e^{-\beta s}, \qquad s > 0,$$

so S_n = X1 + ··· + Xn ∼ Γ(α1 + ··· + αn, β). By induction, the result holds for every n.

Hence, proved.
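The general result can also be sanity-checked by Monte Carlo: simulate independent Γ(αi, β) draws, sum them, and compare the empirical mean and variance of S with the Γ(Σαi, β) values Σαi/β and Σαi/β². A minimal sketch with arbitrary parameters (numpy's gamma sampler takes a scale, i.e. 1/β):

```python
# Monte Carlo check of gamma additivity: independent X_i ~ Gamma(alpha_i, beta)
# with a common rate beta should give S ~ Gamma(sum(alpha_i), beta).
import numpy as np

rng = np.random.default_rng(0)
alphas = [0.5, 1.2, 2.0, 3.3]    # arbitrary shape parameters
beta = 2.0                       # common rate parameter
n_samples = 200_000

# numpy parameterizes the gamma by shape and scale = 1/rate
samples = sum(rng.gamma(a, 1.0 / beta, n_samples) for a in alphas)

a_total = sum(alphas)
# Gamma(a_total, beta) has mean a_total/beta and variance a_total/beta**2
assert abs(samples.mean() - a_total / beta) < 0.05
assert abs(samples.var() - a_total / beta**2) < 0.05
print("empirical moments match Gamma(sum(alphas), beta)")
```

A stronger check would compare the whole empirical distribution (e.g. a Kolmogorov–Smirnov test against the Γ(Σαi, β) cdf), but matching the first two moments already catches a wrong shape or rate.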

