A random sample of 144 chicken nuggets from someone's McDonald's order was observed. Each nugget takes an average of 1.6 minutes to cook, with a standard deviation of 1.3 minutes. Find a 95% confidence interval for the true mean time it takes to cook a nugget.
Solution:
Given that,
Point estimate = sample mean = x̄ = 1.6
Standard deviation = σ = 1.3 (the sample standard deviation is used in place of σ; with n = 144 the z interval is appropriate)
Sample size = n = 144
At the 95% confidence level, the critical value z is:
α = 1 - 0.95 = 0.05
α/2 = 0.05/2 = 0.025
Z_α/2 = Z_0.025 = 1.96 (using the z table)
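As a quick check, the critical value can also be computed from the standard normal distribution instead of being read from a table. The sketch below assumes SciPy is available; the variable names are illustrative.

```python
from scipy.stats import norm

confidence = 0.95                  # confidence level from the problem
alpha = 1 - confidence             # 0.05
z_crit = norm.ppf(1 - alpha / 2)   # upper alpha/2 quantile of the standard normal

print(round(z_crit, 2))            # 1.96
```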
Margin of error = E = Z_α/2 · (σ/√n)
= 1.96 · (1.3/√144)
= 0.2123
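A minimal sketch of the margin-of-error arithmetic, assuming the values above (names are illustrative):

```python
import math

z_crit = 1.96                       # z value for 95% confidence
sigma = 1.3                         # standard deviation (minutes)
n = 144                             # sample size

E = z_crit * sigma / math.sqrt(n)   # margin of error
print(round(E, 4))                  # 0.2123
```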
At the 95% confidence level, the interval estimate of the population mean μ is:
x̄ - E < μ < x̄ + E
1.6 - 0.2123 < μ < 1.6 + 0.2123
1.3877 < μ < 1.8123
(1.3877, 1.8123)
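The whole interval can also be produced in one call. A sketch assuming SciPy and the summary statistics above:

```python
import math
from scipy.stats import norm

x_bar = 1.6    # sample mean (minutes)
sigma = 1.3    # standard deviation (minutes)
n = 144        # sample size

# z interval: x_bar ± z_{alpha/2} * sigma / sqrt(n)
lower, upper = norm.interval(0.95, loc=x_bar, scale=sigma / math.sqrt(n))
print(round(lower, 4), round(upper, 4))   # 1.3877 1.8123
```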