Let X and Y be two independent and identically distributed
random variables with expected value 1 and variance 2.56.
(i) Find a non-trivial upper bound for P(|X + Y − 2| ≥ 1). 5 MARKS
(ii) Now suppose that X and Y are independent and identically
distributed N(1, 2.56) random variables. What is P(|X + Y − 2| ≥ 1)
exactly? Briefly, state your reasoning. 2 MARKS
(iii) Why is the upper bound you obtained in Part (i) so
different...
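A small numerical companion for parts (i) and (ii), offered as a sketch only: it assumes 2.56 is a variance (so that under independence X + Y has mean 2 and variance 5.12) and that Chebyshev's inequality is one natural candidate for the bound in part (i).

```python
# Numerical companion for parts (i) and (ii); assumes 2.56 is a variance and
# that Chebyshev's inequality is one candidate tool for the bound in (i).
from math import sqrt
from scipy.stats import norm

var_sum = 2.56 + 2.56                    # Var(X + Y) under independence
t = 1.0                                  # deviation of interest

chebyshev = var_sum / t**2               # P(|X + Y - 2| >= t) <= Var(X + Y) / t^2
exact = 2 * norm.sf(t / sqrt(var_sum))   # part (ii): X + Y ~ N(2, 5.12), two-sided tail

print(f"Chebyshev bound  : {chebyshev:.4f}")
print(f"Exact N(2, 5.12) : {exact:.4f}")
```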
Let X and Z be two independently distributed standard normal random variables and let Y = X^2 + Z.
(i) Show that E(Y | X) = X^2.
(ii) Show that μY = 1.
(iii) Show that E(XY) = 0.
(iv) Show that cov(X, Y) = 0 and thus ρX,Y = 0.
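The point of parts (iii) and (iv) is that zero correlation does not imply independence: Y depends on X through X^2 yet is uncorrelated with it. A quick Monte Carlo sanity check (not the requested analytic proof):

```python
# Monte Carlo sanity check (not a proof) that mu_Y = 1 and cov(X, Y) = 0
# for Y = X^2 + Z with X, Z independent standard normal.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)
z = rng.standard_normal(n)
y = x**2 + z

print("mean of Y :", y.mean())              # should be close to 1
print("E[XY]     :", (x * y).mean())        # should be close to 0
print("cov(X, Y) :", np.cov(x, y)[0, 1])    # close to 0, although Y depends on X
```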
A: Suppose two random variables X and Y are independent and
identically distributed as standard normal. Specify the joint
probability density function f(x, y) of X and Y.
Next, suppose two random variables X and Y are independent and
identically distributed as Bernoulli with parameter 1 2 . Specify
the joint probability mass function f(x, y) of X and Y.
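For reference, under independence the joint distribution is the product of the marginals: in the normal case f(x, y) = φ(x)φ(y) = (1/(2π)) exp(−(x^2 + y^2)/2), and in the Bernoulli(1/2) case f(x, y) = 1/4 for each (x, y) in {0, 1}^2. A tiny numeric illustration of the discrete factorization (a sketch, not the requested answer):

```python
# Joint pmf as the product of the marginals for two independent Bernoulli(1/2) variables.
from itertools import product

p_marginal = {0: 0.5, 1: 0.5}
joint = {(x, y): p_marginal[x] * p_marginal[y] for x, y in product(p_marginal, repeat=2)}
print(joint)  # each of (0,0), (0,1), (1,0), (1,1) gets mass 1/4
```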
B: Consider a time series realization X = [10, 15, 23, 20, 19] with a length of five periods. Compute the...
Let X1, X2, ..., Xn be independent, identically distributed random variables with p.d.f. f(x) = θx^(θ−1), 0 ≤ x ≤ 1. c) Show that the maximum likelihood estimator for θ is biased, and find a function of the MLE that is unbiased. (Hint: Show that the random variable −ln(Xi) is exponential, the sum of exponentials is Gamma, and the mean of 1/X for a Gamma with parameters α and β is 1/(β(α − 1)).) d)...
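For the given pdf the MLE works out to θ̂ = n / (−Σ ln Xi), and following the hint chain gives E[θ̂] = nθ/(n − 1), so (n − 1)θ̂/n is unbiased. A Monte Carlo check of that claim (a sketch only; θ = 2 and n = 5 are arbitrary illustration values, and it uses the fact that this pdf is the Beta(θ, 1) density):

```python
# Monte Carlo check that theta_hat = n / (-sum(log(X_i))) is biased
# and that (n-1)/n * theta_hat is approximately unbiased.
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 2.0, 5, 200_000

x = rng.beta(theta, 1.0, size=(reps, n))   # draws from f(x) = theta * x^(theta-1)
mle = n / (-np.log(x).sum(axis=1))         # maximum likelihood estimator per sample

print("E[theta_hat]           ~", mle.mean())                   # about n*theta/(n-1) = 2.5
print("E[(n-1)/n * theta_hat] ~", ((n - 1) / n * mle).mean())   # about theta = 2.0
```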
Let X1, X2, . . . be a sequence of independent and identically
distributed random variables where the distribution is given by the
so-called zero-truncated Poisson distribution with probability mass
function: P(X = x) = λ^x / (x!(e^λ − 1)), x = 1, 2, 3, ...
Let N ∼ Binomial(n, 1−e^−λ ) be another random variable that is
independent of the Xi ’s.
1) Show that Y = X1 + X2 + ... + XN has a Poisson distribution with mean nλ.
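A simulation check of the claimed result (a sketch, not the requested proof; n = 4 and λ = 0.7 are arbitrary illustration values):

```python
# Monte Carlo check that summing a Binomial(n, 1 - e^{-lam}) number of iid
# zero-truncated Poisson(lam) variables gives a Poisson(n * lam) distribution.
import numpy as np

rng = np.random.default_rng(2)
n, lam, reps = 4, 0.7, 100_000

def zero_truncated_poisson(size):
    # rejection sampling: redraw Poisson(lam) variates until none are zero
    out = rng.poisson(lam, size)
    while (mask := out == 0).any():
        out[mask] = rng.poisson(lam, mask.sum())
    return out

N = rng.binomial(n, 1 - np.exp(-lam), reps)
Y = np.array([zero_truncated_poisson(k).sum() if k > 0 else 0 for k in N])

print("mean of Y     :", Y.mean(), " (target", n * lam, ")")
print("variance of Y :", Y.var(), " (Poisson => also", n * lam, ")")
```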
Let X1 and X2 be independent identically distributed random
variables with pmf p(0) = 1/4, p(1) = 1/2, p(2) = 1/4
(a) What is the probability mass function (pmf) of X1 + X2?
(b) What is the probability mass function (pmf) of X(2) =
max{X1, X2}?
(c) What is the MGF of X1?
(d) What is the MGF of X1 + X2? (Note: The formulas we did were
for the continuous case, so they don’t directly apply here, but you...
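A numeric sketch for (a), (c), and (d), not the requested written answer: the pmf of X1 + X2 is the convolution of the pmf with itself, M_X1(t) = 1/4 + e^t/2 + e^{2t}/4, and by independence M_{X1+X2}(t) = M_X1(t)^2.

```python
# Convolution / MGF check for the pmf p(0)=1/4, p(1)=1/2, p(2)=1/4.
import numpy as np

p = np.array([0.25, 0.5, 0.25])        # pmf of X1 on support {0, 1, 2}
p_sum = np.convolve(p, p)              # pmf of X1 + X2 on support {0, 1, 2, 3, 4}
print("pmf of X1 + X2:", p_sum)        # [1/16, 4/16, 6/16, 4/16, 1/16]

t = 0.3                                # any fixed t works for the identity below
mgf_x1 = np.sum(p * np.exp(t * np.arange(3)))
mgf_sum = np.sum(p_sum * np.exp(t * np.arange(5)))
print("M_X1(t)^2 == M_{X1+X2}(t):", np.isclose(mgf_x1**2, mgf_sum))
```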
Let X1, ..., X5 be independent and identically distributed (iid) random variables sampled from a binomial distribution Bin(3, p).
a) Compute the likelihood function (LF).
b) Adopt the appropriate conjugate prior for the parameter p, choosing hyperparameter values of your own within the support of the distribution.
c) Using (a) and (b), find the posterior distribution of p.
d) Compute the minimum Bayesian risk estimator of p.
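For reference, the conjugate family for a binomial likelihood is the Beta family: with a Beta(a, b) prior the posterior given observations x1, ..., x5 from Bin(3, p) is Beta(a + Σxi, b + 15 − Σxi), and under squared-error loss the minimum-Bayes-risk estimator is the posterior mean. A sketch of the update (the data and the hyperparameters a, b below are made up for illustration):

```python
# Conjugate Beta-Binomial update; data and prior hyperparameters are illustrative only.
a, b = 2.0, 2.0                   # Beta(a, b) prior on p
x = [1, 2, 0, 3, 1]               # hypothetical observations from Bin(3, p)
m = 3                             # trials per observation

a_post = a + sum(x)               # add total successes
b_post = b + m * len(x) - sum(x)  # add total failures
posterior_mean = a_post / (a_post + b_post)  # Bayes estimator under squared-error loss
print(f"posterior: Beta({a_post}, {b_post}), posterior mean = {posterior_mean:.4f}")
```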
Let X and Y be random variables. Suppose P(X = 0, Y = 0) = 0.1, P(X = 1, Y = 0) = 0.3, P(X = 2, Y = 0) = 0.2, P(X = 0, Y = 1) = 0.2, P(X = 1, Y = 1) = 0.2, P(X = 2, Y = 1) = 0.
a. Determine E(X) and E(Y ).
b. Find Cov(X, Y )
c. Find Cov(2X + 3Y, Y ).
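A quick numeric check of parts a–c from the joint pmf (a sketch, not the written solution); part c uses the bilinearity identity Cov(2X + 3Y, Y) = 2 Cov(X, Y) + 3 Var(Y).

```python
# Expectations and covariances computed directly from the joint pmf.
import numpy as np

joint = np.array([[0.1, 0.3, 0.2],    # row: Y = 0, columns: X = 0, 1, 2
                  [0.2, 0.2, 0.0]])   # row: Y = 1
xs, ys = np.array([0, 1, 2]), np.array([0, 1])

EX = (joint.sum(axis=0) * xs).sum()
EY = (joint.sum(axis=1) * ys).sum()
EXY = sum(joint[j, i] * xs[i] * ys[j] for i in range(3) for j in range(2))
VarY = (joint.sum(axis=1) * ys**2).sum() - EY**2
cov_xy = EXY - EX * EY

print("E(X) =", EX, " E(Y) =", EY)
print("Cov(X, Y) =", cov_xy)
print("Cov(2X + 3Y, Y) =", 2 * cov_xy + 3 * VarY)  # bilinearity of covariance
```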
Let the Xi be independent and identically distributed Poisson random variables for 1 ≤ i ≤ n. Derive the distribution of the sum of the Xi from 1 to n. (Without using MGFs.)
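The MGF-free route is the convolution: P(X1 + X2 = k) = Σj e^{−λ}λ^j/j! · e^{−λ}λ^{k−j}/(k−j)! = e^{−2λ}(2λ)^k/k! by the binomial theorem, and induction extends this to n terms. A numeric illustration of that identity (λ = 1.3 and n = 4 are arbitrary illustration values, and the support is truncated for the check):

```python
# Numeric illustration that convolving Poisson(lam) pmfs n times gives the
# Poisson(n * lam) pmf (truncated support; not the requested derivation).
import numpy as np
from scipy.stats import poisson

lam, n, K = 1.3, 4, 40                    # truncate the support at K for the check
p = poisson.pmf(np.arange(K), lam)        # pmf of one Poisson(lam) variable

p_sum = p.copy()
for _ in range(n - 1):                    # repeated convolution = pmf of the sum
    p_sum = np.convolve(p_sum, p)[:K]

# first 20 entries of the convolved pmf should match Poisson(n * lam)
print(np.allclose(p_sum[:20], poisson.pmf(np.arange(20), n * lam), atol=1e-10))
```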