Diane has decided to play the following game of chance. She places a $1 bet on each repeated play of the game, in which the probability of her winning $1 is 0.6. She has further decided to continue playing until she has either accumulated a total of $3 or lost all her money. What is the probability that Diane will eventually leave the game a winner if she started with a capital of $1? Of $2?
Starting capital | Probability of leaving a winner
capital of $1    | 9/19 ≈ 0.4737
capital of $2    | 15/19 ≈ 0.7895
Solving Probability Problems Using Markov Chains:
Markov chains are probabilistic models that describe how a system moves from one state to another, or remains in its current state, with each transition assigned a certain probability. Note that the probabilities of all transitions out of a given state must sum to unity. Matrix operations are then used to find the probabilities after n runs, and the long-run probabilities are found by simply repeating the iteration until the state vector settles.
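As a minimal sketch of this idea, one "run" multiplies the state (row) vector by the transition matrix. The two-state matrix below is purely illustrative, not from this problem:

```python
# Illustrative two-state chain: row i holds the probabilities of
# moving FROM state i, so each row must sum to 1 (unity).
T = [[0.7, 0.3],
     [0.4, 0.6]]

assert all(abs(sum(row) - 1.0) < 1e-12 for row in T)

def step(x, T):
    """One run: multiply the state (row) vector by the transition matrix."""
    n = len(T)
    return [sum(x[i] * T[i][j] for i in range(n)) for j in range(n)]

x0 = [1.0, 0.0]   # start in state 0 with certainty
x1 = step(x0, T)
print(x1)         # probabilities after one run -> [0.7, 0.3]
```

Repeating `step` gives the probabilities after n runs, which is exactly the iteration used in the solution below.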
Answer and Explanation:
In using the Markov process in this problem, the transition matrix must first be identified. For this problem, the following are known:
1. The probability of winning is 0.6 and of losing is 0.4.
2. Once Diane has no more money ($0), the game stops and it stays that way ($0 is an absorbing state).
3. Once Diane has $3, the game stops and it stays that way ($3 is an absorbing state).
4. For as long as she has some money but less than $3, the game continues.
Using these facts, and ordering the states as $0, $1, $2, $3 (rows giving the "from" state), the transition matrix is then defined as follows:

          $0    $1    $2    $3
    $0 [ 1.0   0.0   0.0   0.0 ]
T = $1 [ 0.4   0.0   0.6   0.0 ]
    $2 [ 0.0   0.4   0.0   0.6 ]
    $3 [ 0.0   0.0   0.0   1.0 ]
The initial state matrix representing that she has $1 is X_0 = [0 1 0 0].
The probabilities after one run are then computed using the formula X_(n+1) = X_n T, which gives X_1 = X_0 T = [0.4 0 0.6 0].
The formula is then applied 31 times for this problem (enough for the transient probabilities to essentially vanish), which results in X_31 ≈ [0.5263 0 0 0.4737]. The probability that Diane eventually leaves the game a winner, starting with a capital of $1, is therefore about 0.4737 (exactly 9/19).
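The 31 iterations can be sketched in a few lines of Python, using the transition matrix stated above (states ordered $0, $1, $2, $3; rows are the "from" state):

```python
# Gambler's-ruin chain for this problem: win $1 w.p. 0.6, lose w.p. 0.4,
# with $0 and $3 absorbing.
T = [[1.0, 0.0, 0.0, 0.0],   # $0 is absorbing
     [0.4, 0.0, 0.6, 0.0],   # from $1: lose -> $0, win -> $2
     [0.0, 0.4, 0.0, 0.6],   # from $2: lose -> $1, win -> $3
     [0.0, 0.0, 0.0, 1.0]]   # $3 is absorbing

def step(x, T):
    """One run: X_(n+1) = X_n T for a row state vector x."""
    n = len(T)
    return [sum(x[i] * T[i][j] for i in range(n)) for j in range(n)]

x = [0.0, 1.0, 0.0, 0.0]     # X_0: she starts with $1
for _ in range(31):          # apply the formula 31 times
    x = step(x, T)

print(round(x[3], 4))        # P(ends with $3) -> 0.4737
```

The fourth entry of the resulting vector is the probability of being absorbed at $3, i.e., of leaving the game a winner.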
The same is done for the case wherein she has $2, but the initial state matrix is then X_0 = [0 0 1 0], representing that she starts with $2.
Doing the same process 31 times yields X_31 ≈ [0.2105 0 0 0.7895], so the probability that Diane eventually leaves the game a winner, starting with a capital of $2, is about 0.7895 (exactly 15/19).
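The $2 case runs the same way, and the result can be cross-checked against the classic closed-form gambler's-ruin probability (1 - (q/p)^i) / (1 - (q/p)^N), an independent check not used in the matrix solution above:

```python
# Same chain as before; only the starting vector changes.
T = [[1.0, 0.0, 0.0, 0.0],
     [0.4, 0.0, 0.6, 0.0],
     [0.0, 0.4, 0.0, 0.6],
     [0.0, 0.0, 0.0, 1.0]]

def step(x, T):
    n = len(T)
    return [sum(x[i] * T[i][j] for i in range(n)) for j in range(n)]

x = [0.0, 0.0, 1.0, 0.0]     # X_0: she starts with $2
for _ in range(31):
    x = step(x, T)

# Closed-form gambler's-ruin check: P(reach $N before $0 | start $i)
p, q, N, i = 0.6, 0.4, 3, 2
closed_form = (1 - (q / p) ** i) / (1 - (q / p) ** N)   # = 15/19

print(round(x[3], 4), round(closed_form, 4))  # both -> 0.7895
```

The agreement between the iterated matrix and the closed form confirms that 31 iterations are more than enough for convergence here.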