In: Statistics and Probability
[URGENT] An Uber driver only provides service in city A and city B, dropping off passengers and immediately picking up a new one at the same spot. He finds the following Markov dependence: for each trip, if the driver is in city A, the probability that he has to drive passengers to city B is 0.25. If he is in city B, the probability that he has to drive passengers to city A is 0.45.
(a) What is the 1-step transition matrix? (Let 1 = City A and 2 = City B)
(b) After many trips between the two cities, what is the probability he will be in city B?
(c) Suppose he is in city B, what is the probability he will be in city A after two trips?
ANSWER::
(a)
Let State 1 and State 2 denote the Uber driver providing service in City A and City B, respectively.
The transition probability from state 1 to state 2 is 0.25, so the transition probability from state 1 to state 1 is 1 - 0.25 = 0.75.
The transition probability from state 2 to state 1 is 0.45, so the transition probability from state 2 to state 2 is 1 - 0.45 = 0.55.
The 1-step transition matrix (row 1 = City A, row 2 = City B) is

P = [ 0.75  0.25 ]
    [ 0.45  0.55 ]
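A minimal sketch of this matrix in Python with numpy (indexing the states as 0 = City A, 1 = City B is an assumption of the sketch, not part of the problem statement):

import numpy as np

# 1-step transition matrix; row i gives the distribution of the next city
# given the driver is currently in city i (index 0 = City A, 1 = City B).
P = np.array([[0.75, 0.25],
              [0.45, 0.55]])

# Sanity check: every row of a stochastic matrix must sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)
print(P)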
b)
Let a and b be the long-run proportions of time that the Uber driver is in City A and City B, respectively.
They satisfy a + b = 1 (so a = 1 - b) together with the balance equation for state 1:
0.75a + 0.45b = a
-0.25a + 0.45b = 0
-0.25(1 - b) + 0.45b = 0
-0.25 + 0.25b + 0.45b = 0
0.7b = 0.25
b = 0.25 / 0.7 = 0.3571
After many trips, probability that he will be in city B is 0.3571.
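As a numerical check (a sketch assuming the same numpy matrix P as above), raising P to a high power makes every row converge to the stationary distribution, and the stationary equations can also be solved directly as a small linear system:

import numpy as np

P = np.array([[0.75, 0.25],
              [0.45, 0.55]])

# Long-run behaviour: the rows of P^n approach the stationary distribution.
P_many = np.linalg.matrix_power(P, 50)
print(P_many[0])   # approx [0.6429, 0.3571]

# Direct solution of pi P = pi together with pi_A + pi_B = 1.
A = np.vstack([P.T - np.eye(2), np.ones(2)])
pi = np.linalg.lstsq(A, np.array([0.0, 0.0, 1.0]), rcond=None)[0]
print(pi)          # approx [0.6429, 0.3571], so P(City B) is about 0.3571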
c)
Let Xn = 1 or 2 denote that the Uber driver is in City A or City B after n trips.
The probability that he is in City A after two trips, given that he starts in City B, is
P(X2 = 1 | X0 = 2) = P(X1 = 1 | X0 = 2) P(X2 = 1 | X1 = 1) + P(X1 = 2 | X0 = 2) P(X2 = 1 | X1 = 2)
= 0.45 * 0.75 + 0.55 * 0.45
= 0.3375 + 0.2475
= 0.585
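This is exactly the (2,1) entry of the two-step transition matrix P^2; a short check, again assuming the numpy setup from the earlier sketch:

import numpy as np

P = np.array([[0.75, 0.25],
              [0.45, 0.55]])

# Two-step transition matrix; entry [1, 0] is
# P(City A after 2 trips | start in City B).
P2 = P @ P
print(P2[1, 0])    # 0.585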