In: Economics
The diner and the waiter play a sequential game. The waiter decides whether to give good service or not; the diner then decides, in each case, whether to leave a $4 tip. If there is bad service and no tip, each gets a payoff of 0. Good service is worth $6 to the diner and costs the waiter $2. So if there is good service and a tip, the diner gets a payoff of 6-4 = 2 and the waiter a payoff of 4-2 = 2, for example.
a) What is the unique sub-game perfect equilibrium of this game? Why?
b) Explain how infinite repetition makes it possible for tipping to occur. Construct an equilibrium where this happens.
c) Use an analogue of the Grim Trigger and find what conditions on d (= 1/(1+r)) allow such an equilibrium to work. Explain your work.
d) Now suppose that with probability 1-p in each period the game will end after this period - the waiter will get a job as an actor. Now, what conditions on d are necessary?
a) The payoffs (diner, waiter) implied by the problem are:

                      Tip ($4)     No tip
    Good service      (2, 2)       (6, -2)
    Bad service       (-4, 4)      (0, 0)

The unique sub-game perfect equilibrium, found by backward induction, is bad service and no tip, with payoffs (0, 0). At either of the diner's decision nodes, not tipping is strictly better: tipping costs $4 and buys nothing in a one-shot game. Anticipating no tip, the waiter avoids the $2 cost of good service. The cooperative outcome (2, 2) Pareto-dominates (0, 0), but it is not an equilibrium of the one-shot game. (Every finite extensive game has a sub-game perfect equilibrium; here backward induction yields a unique one because all preferences at each node are strict.)
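The backward-induction argument can be checked mechanically. A minimal Python sketch, with function names and the True/False encoding of moves my own:

```python
# Hypothetical sketch: backward induction on the one-shot tipping game.
# Payoffs taken from the problem: tip = $4, good service worth $6 to the
# diner and costing the waiter $2.

TIP, SERVICE_VALUE, SERVICE_COST = 4, 6, 2

def diner_payoff(good_service, tip):
    return (SERVICE_VALUE if good_service else 0) - (TIP if tip else 0)

def waiter_payoff(good_service, tip):
    return (TIP if tip else 0) - (SERVICE_COST if good_service else 0)

def spe():
    # Stage 2: at each service node the diner picks the tip decision
    # maximizing her own payoff.
    best_tip = {s: max([True, False], key=lambda t: diner_payoff(s, t))
                for s in [True, False]}
    # Stage 1: the waiter anticipates the diner's best response.
    service = max([True, False],
                  key=lambda s: waiter_payoff(s, best_tip[s]))
    tip = best_tip[service]
    return service, tip, (diner_payoff(service, tip),
                          waiter_payoff(service, tip))

print(spe())  # (False, False, (0, 0)): bad service, no tip
```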
b) In the one-shot game tipping never occurs, but infinite repetition makes it possible, because players can condition today's actions on the entire history and punish a deviation in every future period. Note the prisoner's-dilemma structure of the stage game: the waiter's payoffs from bad service (0 without a tip, 4 with one) dominate his payoffs from good service (-2 and 2), and the diner's payoffs from not tipping (0 or 6) dominate her payoffs from tipping (-4 or 2). Each player individually prefers to defect, even though (good service, tip), with payoffs (2, 2), leaves both better off than (bad service, no tip).
An equilibrium in which tipping occurs uses grim-trigger strategies. The waiter gives good service as long as every past period has ended in (good service, tip); the diner tips after good service under the same condition. After any deviation by either player, the waiter serves badly and the diner never tips again, forever. If players are patient enough (part c), neither gains from a one-period deviation followed by permanent punishment, so tipping occurs in every period on the equilibrium path.
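The trigger strategies above can be illustrated as a path of play; a sketch where a deviation is injected at an assumed period (the helper name and its arguments are illustrative):

```python
# Grim-trigger path of play: (good service, tip) every period until the
# first deviation, then (bad service, no tip) forever after.
def grim_path(n_periods, first_deviation=None):
    path = []
    for t in range(n_periods):
        if first_deviation is not None and t >= first_deviation:
            path.append("bad service, no tip")  # permanent punishment phase
        else:
            path.append("good service, tip")    # cooperative phase
    return path

print(grim_path(4, first_deviation=2))
# ['good service, tip', 'good service, tip',
#  'bad service, no tip', 'bad service, no tip']
```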
c)
For the diner, defecting means withholding the tip after good service: it pays 6 today and, under grim trigger, 0 in every later period, while cooperating pays 2 in every period. She defects only if the deviation payoff is at least the cooperation payoff:
6 + 0*d/(1-d) >= 2/(1-d)
==> 6(1-d) >= 2
==> 1-d >= 1/3
==> d <= 2/3
The diner does not defect if d >= 2/3.
For the waiter, defecting means giving bad service. Because the diner observes the service before tipping, she withholds the tip in that same period, so the deviation pays 0 today and 0 thereafter, while cooperating pays 2/(1-d) > 0. The waiter therefore never gains from deviating. (Even under the stronger assumption that a defecting waiter still collects the tip once, 4 >= 2/(1-d) requires d <= 1/2, a weaker constraint than the diner's.)
Therefore the grim-trigger equilibrium works if and only if d >= 2/3; the diner's no-deviation condition is the binding one.
d) Each period the game continues with probability p and ends with probability 1-p. A payoff one period ahead is therefore worth d*p per unit today, so the survival probability simply rescales the discount factor: replace d with d*p in the binding constraint from part (c).
2/(1-d*p) >= 6
==> 1 - d*p <= 1/3
==> d*p >= 2/3
==> d >= 2/(3p)
Since d <= 1, this is feasible only when p >= 2/3: if the game is too likely to end, tipping cannot be sustained for any discount factor.
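The bound d >= 2/(3p) can be tabulated for a few survival probabilities; a sketch (the function name is mine):

```python
# Minimum discount factor sustaining tipping when, each period, the game
# continues with probability p (effective discount factor is d*p).
def min_discount(p):
    return 2 / (3 * p)  # from d*p >= 2/3

for p in (0.7, 0.8, 0.9, 1.0):
    print(p, round(min_discount(p), 3))
# The bound exceeds 1 whenever p < 2/3, so tipping is then unsustainable
# no matter how patient the players are.
```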