Question

In: Statistics and Probability

Consider two flips of a coin. Let X1 be the random variable which is 1 if...

Consider two flips of a coin. Let X1 be the random variable which is 1 if the first coin is heads and 0 otherwise; let X2 be the random variable which is 1 if the second coin is heads and 0 otherwise; and let X3 be the random variable which is 1 if the two coins are the same and 0 otherwise. Show that these three variables are pairwise independent but not independent.

Solutions

Expert Solution

Proof of pairwise independence:

Pairwise independence means that any two of the three variables, taken as a pair, are independent: P(Xi = a, Xj = b) = P(Xi = a) P(Xj = b) for all values a and b.

First consider the pair (X1, X2). The two flips are physically separate, so the outcome of one does not affect the other. Assuming the coin is fair, each of X1 and X2 equals 1 with probability 1/2, and each of the four joint outcomes has probability 1/4 = (1/2)(1/2), which is exactly the product of the marginals.

Next consider the pair (X1, X3). Knowing X1 does not help predict X3, because X3 also depends on X2. Concretely, P(X3 = 1) = 1/2, since the coins agree on exactly two of the four equally likely outcomes (HH and TT). Given X1 = 1, we have X3 = 1 exactly when X2 = 1, which happens with probability 1/2; the same holds given X1 = 0. So P(X3 = b | X1 = a) = P(X3 = b) for all a and b, which means the pair (X1, X3) is independent.

By the same argument, with the roles of X1 and X2 exchanged, the pair (X2, X3) is independent as well.
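The three pairwise checks above can be confirmed by brute-force enumeration of the four equally likely outcomes. Here is a small sketch in Python (the helper names are my own, not part of the original solution):

```python
from itertools import product
from fractions import Fraction

# The four equally likely outcomes of two fair coin flips (1 = heads).
outcomes = list(product([0, 1], repeat=2))
p = Fraction(1, 4)  # probability of each outcome

def variables(f1, f2):
    x1 = f1                      # 1 if the first coin is heads
    x2 = f2                      # 1 if the second coin is heads
    x3 = 1 if f1 == f2 else 0    # 1 if the two coins agree
    return x1, x2, x3

def prob(event):
    """Probability that `event` holds, summed over all outcomes."""
    return sum(p for o in outcomes if event(*variables(*o)))

# Check P(Xi = a, Xj = b) == P(Xi = a) * P(Xj = b) for every pair (i, j)
# and every pair of values (a, b).
for i, j in [(0, 1), (0, 2), (1, 2)]:
    for a, b in product([0, 1], repeat=2):
        joint = prob(lambda *x: x[i] == a and x[j] == b)
        marginals = prob(lambda *x: x[i] == a) * prob(lambda *x: x[j] == b)
        assert joint == marginals
print("all three pairs are independent")
```

Using exact `Fraction` arithmetic avoids any floating-point rounding when comparing the joint probability to the product of marginals.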

Now consider the three variables together. If they were mutually independent, then knowing the values of any two of them would tell us nothing about the third.

But that fails here, because X3 is completely determined by X1 and X2: X3 = 1 exactly when X1 = X2. For example, P(X1 = 1, X2 = 1, X3 = 1) = P(X1 = 1, X2 = 1) = 1/4, whereas mutual independence would require P(X1 = 1) P(X2 = 1) P(X3 = 1) = 1/8.

Similarly, knowing any two of the three variables (X1 and X3, or X2 and X3) determines the remaining one exactly.

Thus the three variables are pairwise independent but not mutually independent.
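The failure of mutual independence can be seen in one line of enumeration. The sketch below (again my own, with exact fractions) computes the joint probability P(X1 = 1, X2 = 1, X3 = 1) and compares it with the product of the three marginals:

```python
from itertools import product
from fractions import Fraction

p = Fraction(1, 4)  # each of the four two-flip outcomes is equally likely
outcomes = [(f1, f2, 1 if f1 == f2 else 0)
            for f1, f2 in product([0, 1], repeat=2)]

def prob(event):
    return sum(p for x in outcomes if event(*x))

# Each variable alone is a fair coin: P(Xi = 1) = 1/2. Were the three
# variables mutually independent, the joint probability would be 1/8;
# since X3 is determined by X1 and X2, it is actually 1/4.
joint = prob(lambda x1, x2, x3: x1 == 1 and x2 == 1 and x3 == 1)
marginal_product = Fraction(1, 2) ** 3
print(joint, marginal_product)  # prints: 1/4 1/8
```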

