A team of software engineers is testing the time taken for a particular type of modern computer to execute a complicated algorithm for factoring large numbers. They would like to estimate the mean time taken for a computer to execute the algorithm. A random sample of 21 times is collected. The mean time in this sample is 684.0 seconds and the sample standard deviation is found to be 96.9 seconds. Calculate the 95% confidence interval for the mean time taken to execute the algorithm. Give your answers to 2 decimal places. ≤ μ ≤
Solution:
Given that,
Point estimate = sample mean = x̄ = 684.0
sample standard deviation = s = 96.9
sample size = n = 21
Degrees of freedom = df = n - 1 = 21 - 1 = 20
At the 95% confidence level:
α = 1 − 0.95 = 0.05
α/2 = 0.05 / 2 = 0.025
t(α/2, df) = t(0.025, 20) = 2.086
Margin of error = E = t(α/2, df) × (s / √n)
= 2.086 × (96.9 / √21)
= 44.11
The 95% confidence interval estimate of the population mean is:
x̄ − E ≤ μ ≤ x̄ + E
684 − 44.11 ≤ μ ≤ 684 + 44.11
639.89 ≤ μ ≤ 728.11
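The arithmetic can be checked with a short Python sketch. It uses the table value t(0.025, 20) = 2.086 directly rather than computing the critical value from a distribution library:

```python
import math

# Sample statistics from the problem
n = 21          # sample size
xbar = 684.0    # sample mean (seconds)
s = 96.9        # sample standard deviation
t_crit = 2.086  # t critical value for alpha/2 = 0.025, df = 20 (from a t-table)

# Margin of error: E = t * s / sqrt(n)
E = t_crit * s / math.sqrt(n)

lower = xbar - E
upper = xbar + E
print(f"{lower:.2f} <= mu <= {upper:.2f}")  # 639.89 <= mu <= 728.11
```

Note that dividing by √26 instead of √21 reproduces the incorrect margin 39.64, which is a common slip when the sample size is copied from a different problem.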