Question

In: Statistics and Probability

1) Describe and elaborate upon the following (a) A beta distribution (b) A joint probability density...

1) Describe and elaborate upon the following

(a) A beta distribution

(b) A joint probability density function

(c) A marginal probability density function

(d) A conditional probability density function

(e) Covariance and correlation between two random variables

Solutions

Expert Solution

A) The beta distribution is a family of continuous probability distributions on the interval (0, 1), which makes it a natural way to represent beliefs about proportions and probabilities. For example, how likely is it that Pavan will win the next casino game? You might think the probability is 0.7; your friend might think it is 0.2. The beta distribution gives you a way to describe this spread of beliefs.

The probability density function of the beta distribution with parameters α > 0 and β > 0 is

f(x) = x^(α−1) (1 − x)^(β−1) / B(α, β), for 0 ≤ x ≤ 1 (and zero elsewhere),

where B(α, β) is the beta function.

All modern statistical packages can work with the beta distribution. Beta is often used as a random variable to represent a belief distribution over probabilities, in contexts well beyond estimating coin flips. It has many desirable properties: its support is exactly (0, 1), matching the values that probabilities can take on, and it has the expressive capacity to capture many different shapes of belief distributions.
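The beta pdf can be sketched directly from the formula f(x) = x^(α−1)(1 − x)^(β−1) / B(α, β), using the identity B(α, β) = Γ(α)Γ(β)/Γ(α + β). This is a minimal illustration, not part of the original question; the Beta(7, 3) parameters are an assumed example of a belief that the win probability is around 0.75.

```python
import math

def beta_pdf(x, alpha, beta):
    """Beta(alpha, beta) density on (0, 1), via B(a, b) = G(a)G(b)/G(a+b)."""
    if not 0 < x < 1:
        return 0.0
    b = math.gamma(alpha) * math.gamma(beta) / math.gamma(alpha + beta)
    return x ** (alpha - 1) * (1 - x) ** (beta - 1) / b

# A Beta(7, 3) belief peaks at its mode (alpha - 1)/(alpha + beta - 2) = 0.75,
# while Beta(1, 1) reduces to the uniform density on (0, 1).
print(beta_pdf(0.75, 7, 3))
print(beta_pdf(0.5, 1, 1))
```

Note that Beta(1, 1) returning 1 everywhere on (0, 1) is a quick sanity check: with α = β = 1 the formula collapses to the uniform distribution.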

B) Joint probability density function

Discrete case: Suppose X and Y are two discrete random variables, where X takes values {x1, x2, . . . , xn} and Y takes values {y1, y2, . . . , ym}. The ordered pair (X, Y) takes values in the product {(x1, y1), (x1, y2), . . . , (xn, ym)}.

The joint probability mass function (joint pmf) of X and Y is the function p(xi, yj) giving the probability of the joint outcome X = xi, Y = yj. In the discrete case we use a pmf rather than a pdf.

A joint probability mass function must satisfy two properties:
1. 0 ≤ p(xi, yj) ≤ 1
2. The total probability is 1. We can express this as a double sum:
Σ_{i=1}^{n} Σ_{j=1}^{m} p(xi, yj) = 1.

This is the Discrete case.

Continuous case: The continuous case is essentially the same as the discrete case: we just replace discrete sets
of values by continuous intervals, the joint probability mass function by a joint probability
density function, and the sums by integrals.
If X takes values in [a, b] and Y takes values in [c, d] then the pair (X, Y ) takes values in
the product [a, b] × [c, d]. The joint probability density function (joint pdf) of X and Y
is a function f(x, y) giving the probability density at (x, y). That is, the probability that
(X, Y ) is in a small rectangle of width dx and height dy around (x, y) is f(x, y) dx dy.

A joint probability density function must satisfy two properties:
1. 0 ≤ f(x, y)
2. The total probability is 1. We now express this as a double integral:
∫_c^d ∫_a^b f(x, y) dx dy = 1
Note: as with the pdf of a single random variable, the joint pdf f(x, y) can take values
greater than 1; it is a probability density, not a probability.
We won't expect you to be experts at double integration. Here's what we will expect:
• You should understand double integrals conceptually as double sums.
• You should be able to compute double integrals over rectangles.

This is all about the joint density function.
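A small numerical sketch of the two properties above, in the discrete case. The joint pmf table here is an assumed example (not from the question): every entry must lie in [0, 1], and the double sum over all pairs must equal 1.

```python
# An assumed joint pmf table for two binary random variables X and Y.
joint_pmf = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# Property 1: every probability lies in [0, 1].
assert all(0 <= p <= 1 for p in joint_pmf.values())

# Property 2: the double sum over all (x_i, y_j) equals 1.
total = sum(joint_pmf.values())
print(total)
```

The same check in the continuous case would replace the sum with a double integral of f(x, y) over the rectangle [a, b] × [c, d].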

C) Marginal probability density function:

Discrete Case:

• Marginal Probability Mass Function
If X and Y are discrete random variables
with joint probability mass function fXY (x, y),
then the marginal probability mass functions
of X and Y are
fX(x) = Σ_y fXY(x, y)
and
fY(y) = Σ_x fXY(x, y)
where the sum for fX(x) is over all points in
the range of (X, Y ) for which X = x and the
sum for fY (y) is over all points in the range
of (X, Y ) for which Y = y.

Continuous Case:

Marginal Probability Density Function
If X and Y are continuous random variables
with joint probability density function fXY (x, y),
then the marginal density functions for X and
Y are
fX(x) = ∫ fXY(x, y) dy
and
fY(y) = ∫ fXY(x, y) dx
where the first integral is over all points in
the range of (X, Y ) for which X = x, and
the second integral is over all points in the
range of (X, Y ) for which Y = y.
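In the discrete case, the marginal pmfs fall out of the joint table by summing over the other variable, which mirrors the integrals above. This sketch reuses an assumed joint pmf table; the table values are illustrative only.

```python
from collections import defaultdict

# An assumed joint pmf table for two binary random variables X and Y.
joint_pmf = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

fX, fY = defaultdict(float), defaultdict(float)
for (x, y), p in joint_pmf.items():
    fX[x] += p   # marginal of X: sum over all y with this x
    fY[y] += p   # marginal of Y: sum over all x with this y

print(dict(fX))
print(dict(fY))
```

Each marginal is itself a valid pmf: its values are in [0, 1] and sum to 1, since the double sum over the joint table is 1.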

D) Conditional Probability density function:

​​​​Conditional Distributions
Earlier we looked at conditional probabilities for events. Here we formally go over conditional probabilities for random variables. The equations for both the discrete and continuous cases are below.

Discrete: the conditional probability mass function (pmf) for the discrete case:

pX|Y(x|y) = P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y) = pX,Y(x, y) / pY(y)

The conditional cumulative distribution function (CDF) for the discrete case:

FX|Y(a|y) = P(X ≤ a | Y = y) = Σ_{x≤a} pX,Y(x, y) / pY(y) = Σ_{x≤a} pX|Y(x|y)

Continuous: the conditional probability density function (pdf) for the continuous case:

fX|Y(x|y) = fX,Y(x, y) / fY(y)

The conditional cumulative distribution function (CDF) for the continuous case:

FX|Y(a|y) = P(X ≤ a | Y = y) = ∫_{−∞}^{a} fX|Y(x|y) dx

This is the conditional probability density function. In statistics, given two jointly distributed random variables X and Y, the conditional probability distribution of Y given X is the probability distribution of Y when X is known to take a particular value; in some cases the conditional probabilities may be expressed as functions containing the unspecified value x of X as a parameter. When both X and Y are categorical variables, a conditional probability table is typically used to represent the conditional probabilities. The conditional distribution contrasts with the marginal distribution of a random variable, which is its distribution without reference to the value of the other variable.
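The discrete formula pX|Y(x|y) = pX,Y(x, y) / pY(y) can be sketched directly: conditioning on Y = y picks out one column of the joint table and renormalizes it by the marginal pY(y). The joint table below is an assumed example.

```python
# An assumed joint pmf table for two binary random variables X and Y.
joint_pmf = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def conditional_pmf_x_given_y(y, joint):
    """p_{X|Y}(x|y) = p_{X,Y}(x, y) / p_Y(y) for each x."""
    p_y = sum(p for (_, y2), p in joint.items() if y2 == y)  # marginal p_Y(y)
    return {x: p / p_y for (x, y2), p in joint.items() if y2 == y}

# Conditioning on Y = 1 renormalizes that column so it sums to 1.
cond = conditional_pmf_x_given_y(1, joint_pmf)
print(cond)
```

The result is itself a valid pmf over x, which is exactly what dividing by pY(y) guarantees.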

E) The correlation between two random variables is defined as the strength of the linear association between them. Correlation tells us how two random variables are related to each other, in particular whether they are linearly related. The correlation always lies between −1 and 1: if the value is close to −1, the two random variables are negatively correlated, and if the value is close to 1, there is a strong positive correlation between them.

On the other hand, the covariance between two variables tells us in which direction the two random variables move with each other, but unlike correlation it is not scaled to lie in [−1, 1]; this is the main difference between correlation and covariance.
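A small numerical sketch of the difference: covariance is the mean product of deviations, Cov(X, Y) = E[(X − E[X])(Y − E[Y])], and correlation rescales it by the two standard deviations, pinning the result to [−1, 1]. The data vectors here are assumed for illustration (y is roughly 2x, so the correlation should be near +1).

```python
import math

# Assumed example data: y is approximately 2*x with small noise.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n

# Covariance: mean product of deviations; its sign shows the direction of co-movement.
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n

# Correlation: covariance divided by both standard deviations, so it lies in [-1, 1].
sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
corr = cov / (sx * sy)

print(cov, corr)
```

Notice that rescaling the data (say, measuring y in different units) would change the covariance but leave the correlation unchanged, which is why correlation is the preferred measure of the strength of a linear relationship.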

This is the detailed answer to all five parts of your question.


