In: Economics
You’re at an arcade and you find yourself standing at a claw crane trying to win a prize. One attempt at a prize will cost you a quarter. As it turns out, you’ve got a Ziploc bag full of quarters, and you decide you’ll play until they’re all gone. Each prize in the machine that you can try to pluck with the claw has some dollar value, but you can’t tell how much each item is worth, or even which are more valuable than others. Suppose you’d like to estimate not just the expected value of your winnings, but the variance too. Describe how you would design and carry out such an analysis for this particular situation.
Finding the expected value of the winnings without knowing the dollar value of the prizes or the probability of winning is difficult.
So, to approach this question, we should make assumptions about the dollar value of the prizes and the probability of winning.
Suppose the maximum dollar value of a prize is $1 and the minimum is $0, and you invest a quarter on every attempt. For the game's owner to be rational, the probability of winning should be lower than 25%; otherwise the expected payout would be at least as large as the quarter invested and the owner would not profit. So let's assume the probability of winning is 20%.
Payoff for you:
$1 with 20% chance
$0 with 80% chance
Expected value of winnings = $1 × 20% + $0 × 80% = $0.20 for every quarter
Now suppose you have 100 quarters, i.e. $25.
So, the total expected winnings = 100 × $0.20 = $20 for the 100 quarters.
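As a quick sanity check on this expected-value arithmetic, here is a minimal Python sketch; the prize value, win probability, and number of plays are the assumed figures from above, and the plays are treated as independent:

```python
import random

PRIZE_VALUE = 1.00   # assumed dollar value of every prize
WIN_PROB = 0.20      # assumed probability of winning a single play
NUM_PLAYS = 100      # one play per quarter in the bag

def simulate_winnings(num_plays: int) -> list[float]:
    """Simulate independent plays; each pays PRIZE_VALUE with probability WIN_PROB, else $0."""
    return [PRIZE_VALUE if random.random() < WIN_PROB else 0.0
            for _ in range(num_plays)]

winnings = simulate_winnings(NUM_PLAYS)
total = sum(winnings)
print(f"Total winnings over {NUM_PLAYS} plays: ${total:.2f}")
print(f"Average winnings per play: ${total / NUM_PLAYS:.4f}")  # should be near $0.20
```

Any single run will wander around the $20 total; averaging many runs converges toward it.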
For calculating the variance, we need the squared deviations from the expected value, each weighted by its probability.
For $1, the deviation from $0.20 is 0.80.
For $0, the deviation from $0.20 is 0.20.
Square these deviations and multiply each by its probability:
20% × (0.80)² = 0.20 × 0.64 = 0.128
80% × (0.20)² = 0.80 × 0.04 = 0.032
Add them:
0.128 + 0.032 = 0.16
Variance = 0.16 (squared dollars) per quarter played.
Take the square root to get the standard deviation:
√0.16 = 0.40
Standard deviation = $0.40 per quarter played.
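Here is the same variance calculation as a short Python sketch under the same assumed payoff distribution; the last few lines additionally assume the 100 plays are independent, so their variances add:

```python
# Per-play payoff under the assumed model: $1 with probability 0.20, $0 with probability 0.80.
outcomes = [1.00, 0.00]
probs = [0.20, 0.80]

mean = sum(x * p for x, p in zip(outcomes, probs))                    # $0.20
variance = sum(p * (x - mean) ** 2 for x, p in zip(outcomes, probs))  # 0.16
std_dev = variance ** 0.5                                             # $0.40

print(f"Expected value per play: ${mean:.2f}")
print(f"Variance per play: {variance:.2f} squared dollars")
print(f"Standard deviation per play: ${std_dev:.2f}")

# Assuming the 100 plays are independent, variances add across plays.
total_variance = 100 * variance            # 16 squared dollars
total_std_dev = total_variance ** 0.5      # $4.00
print(f"Variance of total winnings over 100 plays: {total_variance:.2f}")
print(f"Standard deviation of total winnings: ${total_std_dev:.2f}")
```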
As the assumptions change, these numbers will change as well.
I hope this solution gives you an idea of how to build a model to estimate the expected value and variance of your winnings. Thanks.