In: Statistics and Probability
The actual time it takes to cook a ten pound turkey is normally distributed. Suppose that a random sample of 19 ten pound turkeys is taken. Given that an average of 2.9 hours and a standard deviation of 0.24 hours was found for the sample of 19 turkeys, calculate a 95% confidence interval for the average cooking time of a ten pound turkey.
Solution:
Given that,
x̄ = 2.9
s = 0.24
n = 19
Degrees of freedom = df = n - 1 = 19 - 1 = 18
At the 95% confidence level, the t critical value is:
α = 1 - 0.95 = 0.05
α/2 = 0.05/2 = 0.025
t(α/2, df) = t(0.025, 18) = 2.101
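The table value above can be checked in Python with SciPy's t distribution (a sketch; assumes SciPy is installed):

```python
from scipy import stats

# Two-tailed 95% CI: upper-tail probability alpha/2 = 0.025, df = 18
t_crit = stats.t.ppf(1 - 0.025, df=18)
print(round(t_crit, 3))  # ≈ 2.101
```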
Margin of error = E = t(α/2, df) × (s / √n)
= 2.101 × (0.24 / √19)
= 0.116
Margin of error = 0.116
The 95% confidence interval estimate of the population mean is:
x̄ - E < μ < x̄ + E
2.9 - 0.116 < μ < 2.9 + 0.116
2.784 < μ < 3.016
(2.78, 3.02)
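The whole calculation can be reproduced with standard-library Python, using the table value t(0.025, 18) = 2.101 from the work above:

```python
import math

# Sample statistics from the problem
xbar = 2.9    # sample mean (hours)
s = 0.24      # sample standard deviation (hours)
n = 19        # sample size

t_crit = 2.101                     # t(0.025, 18) from a t table

E = t_crit * s / math.sqrt(n)      # margin of error
lower, upper = xbar - E, xbar + E

print(round(E, 3))                 # ≈ 0.116
print(round(lower, 2), round(upper, 2))
```

With the more precise margin of error, the interval rounds to (2.78, 3.02).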