1) What makes events independent/dependent? Give an example of each.
2) What does the probability distribution of a discrete random variable tell you? How do you graphically display the probability distribution of a discrete random variable?
3) Give two examples of Bernoulli trials.
2 ) Discrete random variables can take on either a finite or at most a countably infinite set of discrete values (for example, the integers). Their probability distribution is given by a probability mass function, which directly maps each value of the random variable to a probability: the value x1 takes on the probability p1, the value x2 takes on the probability p2, and so on. The probabilities pi must satisfy two requirements: every probability pi is a number between 0 and 1, and the sum of all the probabilities is 1 (p1 + p2 + ... + pk = 1). Graphically, such a distribution is displayed as a probability histogram (a bar graph), with the possible values on the horizontal axis and bars whose heights are the corresponding probabilities.
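As an illustration, here is a minimal Python sketch (the particular values and probabilities are made up for the example, and matplotlib is assumed to be available) that stores a probability mass function, checks the two requirements, and displays it as a bar graph:

import matplotlib.pyplot as plt

# A made-up PMF for a discrete random variable X
values = [0, 1, 2, 3]
probs = [0.1, 0.3, 0.4, 0.2]

# Requirement 1: every probability is between 0 and 1
assert all(0 <= p <= 1 for p in probs)
# Requirement 2: the probabilities sum to 1
assert abs(sum(probs) - 1.0) < 1e-9

# Graphical display: values on the horizontal axis, probabilities as bar heights
plt.bar(values, probs)
plt.xlabel("x")
plt.ylabel("P(X = x)")
plt.show()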
3 )
Bernoulli trials are trials that have only two possible outcomes. For example, tossing a coin gives only two outcomes, heads or tails; another example is inspecting a bulb, which is either defective or not defective. If the probability of one outcome is p, then the probability of the other outcome is (1 - p). The probability of getting r successes in n independent Bernoulli trials is given by the binomial distribution:
P(X = r) = C(n, r) * p^r * (1 - p)^(n - r)
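As a quick sketch of how this formula is evaluated (a minimal Python example; the function name binomial_pmf and the sample values are just illustrative):

import math

def binomial_pmf(n, r, p):
    # Probability of exactly r successes in n independent Bernoulli trials,
    # each with success probability p
    return math.comb(n, r) * p**r * (1 - p)**(n - r)

# e.g. exactly 3 heads in 5 tosses of a fair coin
print(binomial_pmf(5, 3, 0.5))  # 0.3125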
EXAMPLES
Problem 1:
If the probability of a bulb being defective is 0.8, then what is
the probability of the bulb not being defective?
Solution:
Probability of bulb being defective, p = 0.8
Probability of bulb not being defective, q = 1 - p = 1 - 0.8 = 0.2
Problem 2:
10 coins are tossed simultaneously, where the probability of getting
a head for each coin is 0.6. Find the probability of getting exactly
4 heads.
Solution:
Probability of getting a head, p = 0.6
Probability of getting a tail, q = 1 - p = 1 - 0.6 = 0.4
Probability of getting 4 heads out of 10:
P(X = 4) = C(10, 4) * (0.6)^4 * (0.4)^6 = 0.111476736
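Both results can be verified with a short Python check (plain standard-library code, using math.comb for the binomial coefficient):

import math

# Problem 1: complement rule, q = 1 - p
print(1 - 0.8)                              # 0.2

# Problem 2: P(X = 4) for n = 10 trials with p = 0.6
print(math.comb(10, 4) * 0.6**4 * 0.4**6)   # 0.111476736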