Wait times for a movie to load on Hulu are normally
distributed with a standard deviation of 4.4 seconds. What
changes the mean is how long it takes Windows to give the Hulu
player priority in the process queue. I want to know the average on
my old computer, so I randomly sample it 19 times, and I get an
average of 668 seconds. Find an 87% confidence interval for the
average time my old computer takes to play a movie on Hulu.
Use 2 decimal places.
Solution:
Given that,
Point estimate = sample mean = x̄ = 668
Standard deviation = s = 4.4
Sample size = n = 19
Degrees of freedom = df = n - 1 = 18
At the 87% confidence level,
α = 1 - 0.87 = 0.13
α/2 = 0.13 / 2 = 0.065
t(α/2, df) = t(0.065, 18) = 1.587
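
As a quick numerical check, this critical value can be reproduced with SciPy (a sketch assuming SciPy is installed; t.ppf is the inverse CDF of the t distribution):

    from scipy.stats import t

    # Two-sided 87% interval: alpha = 0.13, so we need the
    # 1 - alpha/2 = 0.935 quantile with 18 degrees of freedom.
    t_crit = t.ppf(0.935, df=18)
    print(round(t_crit, 3))   # 1.587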
Margin of error = E = t(α/2, df) × (s / √n)
= 1.587 × (4.4 / √19)
≈ 1.60
The 87% confidence interval estimate of the population mean is
x̄ - E < μ < x̄ + E
668 - 1.60 < μ < 668 + 1.60
666.40 < μ < 669.60
An 87% confidence interval for the average time my old computer takes to play a movie on Hulu is
(666.40, 669.60)
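
For completeness, here is a short sketch (assuming SciPy and Python's standard math module) that reproduces the whole calculation end to end:

    from math import sqrt
    from scipy.stats import t

    x_bar = 668   # sample mean (seconds)
    s = 4.4       # given standard deviation (seconds)
    n = 19        # sample size
    conf = 0.87   # confidence level

    # Critical value: upper alpha/2 quantile of t with n - 1 = 18 df.
    t_crit = t.ppf(1 - (1 - conf) / 2, df=n - 1)   # ~1.587

    # Margin of error and the resulting interval.
    E = t_crit * s / sqrt(n)                       # ~1.60
    print(f"({x_bar - E:.2f}, {x_bar + E:.2f})")   # (666.40, 669.60)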