In: Statistics and Probability
1) Describe and elaborate upon the following
(a) A beta distribution
(b) A joint probability density function
(c) A marginal probability density function
(d) A conditional probability density function
(e) Covariance and correlation between two random variables
A) The beta distribution represents a family of continuous probability distributions on (0, 1) and is a natural way to represent beliefs about proportions or probabilities. For example, how likely is it that Pavan will win the next casino game? You might think the probability is 0.7; your friend might think it is 0.2. The beta distribution gives you a way to describe such beliefs.
The probability density function of the beta distribution, for 0 < x < 1 and shape parameters α > 0 and β > 0, is

f(x; α, β) = x^(α−1) (1 − x)^(β−1) / B(α, β),

where B(α, β) is the beta function, which normalizes the total probability to 1.
All modern statistical packages can work with the beta distribution. A beta random variable is often used to represent a belief distribution over probabilities, in contexts well beyond estimating coin flips. It has many desirable properties: its support is exactly (0, 1), matching the values that probabilities can take on, and it has the expressive capacity to capture many different forms of belief distributions.
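As a quick illustration, here is a minimal Python sketch using scipy.stats.beta; the shape parameters alpha = 7 and beta = 3 are made-up values chosen so that the mean belief is 0.7, not numbers taken from the question:

```python
# A minimal sketch of a beta "belief" distribution; alpha=7, beta=3 are
# hypothetical shape parameters chosen so that the mean is 0.7.
from scipy.stats import beta

a, b = 7, 3
dist = beta(a, b)

print(dist.mean())    # alpha / (alpha + beta) = 0.7
print(dist.pdf(0.7))  # density at p = 0.7 (a density can exceed 1)
print(dist.cdf(0.5))  # P(p <= 0.5) under this belief distribution
```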
B) Joint probability density function
Discrete case:
- Suppose X and Y are two discrete random variables, where X takes values {x1, x2, . . . , xn} and Y takes values {y1, y2, . . . , ym}. The ordered pair (X, Y) takes values in the product {(x1, y1), (x1, y2), . . . , (xn, ym)}. The joint probability mass function (joint pmf) of X and Y is the function p(xi, yj) giving the probability of the joint outcome X = xi, Y = yj. In the discrete case we use a PMF.
A joint probability mass function must satisfy two properties:
1. 0 ≤ p(xi, yj) ≤ 1
2. The total probability is 1. We can express this as a double sum:

Σ_{i=1..n} Σ_{j=1..m} p(xi, yj) = 1.

This is the discrete case; a quick numerical check of both properties follows.
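Here is a sketch of that check for a small hypothetical joint pmf table (the probabilities below are invented for illustration):

```python
import numpy as np

# Hypothetical joint pmf: p[i, j] = P(X = x_i, Y = y_j)
# for X in {x1, x2} and Y in {y1, y2, y3}.
p = np.array([[0.10, 0.20, 0.10],
              [0.25, 0.15, 0.20]])

assert np.all((0 <= p) & (p <= 1))  # property 1: every entry is in [0, 1]
assert np.isclose(p.sum(), 1.0)     # property 2: the double sum equals 1
```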
Continuous case: The continuous case is essentially the same as the discrete case: we just replace discrete sets of values by continuous intervals, the joint probability mass function by a joint probability density function, and the sums by integrals.
If X takes values in [a, b] and Y takes values in [c, d], then the pair (X, Y) takes values in the product [a, b] × [c, d]. The joint probability density function (joint pdf) of X and Y is a function f(x, y) giving the probability density at (x, y). That is, the probability that (X, Y) is in a small rectangle of width dx and height dy around (x, y) is f(x, y) dx dy.
A joint probability density function must satisfy two properties:
1. 0 ≤ f(x, y)
2. The total probability is 1. We now express this as a double integral:

∫_{c}^{d} ∫_{a}^{b} f(x, y) dx dy = 1.

Note: as with the pdf of a single random variable, the joint pdf f(x, y) can take values greater than 1; it is a probability density, not a probability.
We won't expect you to be an expert at double integration. Here's what we will expect:
• You should understand double integrals conceptually as double sums.
• You should be able to compute double integrals over rectangles; a numerical sketch of such a computation is shown below.
This is all about the joint density function.
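Here is a small sketch of such a computation with scipy.integrate.dblquad; the joint pdf f(x, y) = 4xy on [0, 1] × [0, 1] is a made-up example chosen because it integrates to exactly 1:

```python
from scipy.integrate import dblquad

# Hypothetical joint pdf: f(x, y) = 4xy on the rectangle [0, 1] x [0, 1].
# dblquad expects the integrand as func(y, x) and integrates y first.
f = lambda y, x: 4 * x * y

total, _ = dblquad(f, 0, 1, 0, 1)     # x from 0 to 1, y from 0 to 1
print(total)                          # ~1.0: total probability is 1

prob, _ = dblquad(f, 0, 0.5, 0, 0.5)  # P(X <= 0.5, Y <= 0.5)
print(prob)                           # ~0.0625
```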
C) Marginal probability density function:
Discrete case:
• Marginal probability mass function
If X and Y are discrete random variables with joint probability mass function fXY(x, y), then the marginal probability mass functions of X and Y are

fX(x) = Σ_y fXY(x, y)   and   fY(y) = Σ_x fXY(x, y),

where the sum for fX(x) is over all points in the range of (X, Y) for which X = x, and the sum for fY(y) is over all points in the range of (X, Y) for which Y = y.
Continuous case:
• Marginal probability density function
If X and Y are continuous random variables with joint probability density function fXY(x, y), then the marginal density functions for X and Y are

fX(x) = ∫ fXY(x, y) dy   and   fY(y) = ∫ fXY(x, y) dx,

where the first integral is over all points in the range of (X, Y) for which X = x, and the second integral is over all points in the range of (X, Y) for which Y = y.
D) Conditional probability density function:
Conditional distributions
Earlier we looked at conditional probabilities for events; here we formally go over conditional probabilities for random variables. The equations for both the discrete and continuous cases are below.
Discrete: The conditional probability mass function (PMF) for the discrete case:

pX|Y(x|y) = P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y) = pX,Y(x, y) / pY(y)

The conditional cumulative distribution function (CDF) for the discrete case:

FX|Y(a|y) = P(X ≤ a | Y = y) = Σ_{x ≤ a} pX,Y(x, y) / pY(y) = Σ_{x ≤ a} pX|Y(x|y)

Continuous: The conditional probability density function (PDF) for the continuous case:

fX|Y(x|y) = fX,Y(x, y) / fY(y)

The conditional cumulative distribution function (CDF) for the continuous case:

FX|Y(a|y) = P(X ≤ a | Y = y) = ∫_{−∞}^{a} fX|Y(x|y) dx

This is the conditional probability density function. In statistics, given two jointly distributed random variables X and Y, the conditional probability distribution of Y given X is the probability distribution of Y when X is known to be a particular value; in some cases the conditional probabilities may be expressed as functions containing the unspecified value x of X as a parameter. When both X and Y are non-numeric (categorical) variables, a conditional probability table is typically used to represent the conditional probability. The conditional distribution contrasts with the marginal distribution of a random variable, which is its distribution without reference to the value of the other variable.
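For the discrete case, the defining formula pX|Y(x|y) = pX,Y(x, y) / pY(y) can be sketched directly on the hypothetical table from part B:

```python
import numpy as np

# Same hypothetical joint pmf table as in parts B and C.
p = np.array([[0.10, 0.20, 0.10],
              [0.25, 0.15, 0.20]])

p_Y = p.sum(axis=0)       # marginal pmf of Y

# Conditional pmf of X given Y = y_j: divide column j by P(Y = y_j).
p_X_given_Y = p / p_Y     # broadcasting divides each column by its marginal

print(p_X_given_Y[:, 0])        # P(X = x_i | Y = y_1) = [0.10, 0.25] / 0.35
print(p_X_given_Y.sum(axis=0))  # each column sums to 1, as a pmf must
```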
E) The correlation between two random variables is defined as the (linear) association between them. Correlation tells us how two random variables are related to each other, in particular whether or not they are linearly related. The correlation always lies between −1 and 1: if the value is close to −1 we can say the two random variables are strongly negatively correlated, and if the value is close to 1 we can say there is a strong positive correlation between them. Formally, Corr(X, Y) = Cov(X, Y) / (σX σY), so correlation is covariance rescaled to be unit-free.
On the other hand, the covariance between two random variables, Cov(X, Y) = E[(X − E[X])(Y − E[Y])], tells us how the two random variables move with each other, but it is expressed in the units of X and Y, so its magnitude depends on scale; this is the main difference between correlation and covariance.
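A short numerical sketch of the difference, using synthetic data (the factor 2 and the sample size are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = 2 * x + rng.normal(size=1000)  # y moves together with x

cov = np.cov(x, y)[0, 1]           # covariance: scale-dependent, ~2 here
corr = np.corrcoef(x, y)[0, 1]     # correlation: always in [-1, 1], ~0.9 here

print(cov, corr)
```

Rescaling y (say, into different units) changes the covariance but leaves the correlation unchanged, which is exactly the distinction described above.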
This is the detailed answer to all five of your questions.