A research council wants to estimate the mean length of time (in minutes) that the average U.S. adult spends watching television using digital video recorders (DVRs) each day. To determine the estimate, the research council takes a random sample of 35 U.S. adults and obtains the following times, in minutes.
24  27  26  29  33  21  18
24  23  34  17  15  19  23
25  29  36  19  18  22  16
45  32  12  24  35  14  40
30  19  14  28  32  15  39
From past studies, the research council has found that the standard deviation of the times is 4.3 minutes and that the population of times is normally distributed.
Construct a 90% confidence interval for the population mean.
Construct a 99% confidence interval for the population mean.
Interpret the results and compare the widths of the confidence intervals.
Test the claim that the mean time spent watching DVRs is 20 minutes each day using a significance level of 0.05.
You may use Stat Disk, TI-84 calculator, or CrunchIt to find the confidence intervals. Be sure to show your results in your post.
Using the Excel function `=AVERAGE(F23:F57)`, the sample mean is x̄ ≈ 25.06 minutes.

Using the Excel function `=STDEV(F23:F57)`, the sample standard deviation is s ≈ 8.32 minutes.
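For anyone without Excel handy, here is a minimal Python sketch (not part of the original workbook; the list literal simply re-enters the 35 times) that reproduces the same two summary statistics:

```python
# Minimal sketch: recompute the sample mean and sample standard deviation.
from statistics import mean, stdev

times = [24, 27, 26, 29, 33, 21, 18, 24, 23, 34, 17, 15, 19, 23, 25,
         29, 36, 19, 18, 22, 16, 45, 32, 12, 24, 35, 14, 40, 30, 19,
         14, 28, 32, 15, 39]

x_bar = mean(times)   # counterpart of =AVERAGE(F23:F57)
s = stdev(times)      # counterpart of =STDEV(F23:F57), with n - 1 in the denominator

print(f"n = {len(times)}, mean = {x_bar:.2f}, s = {s:.2f}")
# -> n = 35, mean = 25.06, s = 8.32
```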
a) The 90% confidence interval for the population mean is given by

x̄ ± t* · s/√n = 25.06 ± 1.691 · (8.32/√35) ≈ (22.68, 27.44),

where t* = 1.691 is the critical t value for 90% confidence with 34 degrees of freedom. From this confidence interval we are 90% confident that the mean length of time (in minutes) that the average U.S. adult spends watching television using a DVR each day lies between about 22.7 and 27.4 minutes.
b) The 99% confidence interval for the population mean is given by

x̄ ± t* · s/√n = 25.06 ± 2.728 · (8.32/√35) ≈ (21.22, 28.89),

where t* = 2.728 is the critical t value for 99% confidence with 34 degrees of freedom. From this confidence interval we are 99% confident that the mean length of time (in minutes) that the average U.S. adult spends watching television using a DVR each day lies between about 21.2 and 28.9 minutes.
Comparing the two intervals, the 99% confidence interval is wider than the 90% confidence interval: to be more confident of capturing the population mean, we must allow a larger margin of error.
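As a cross-check on the software output (the original post used StatDisk, a TI-84, or CrunchIt), a short Python sketch with SciPy, assumed to be installed, reproduces both t-based intervals and their widths:

```python
# Sketch of the two t-based confidence intervals for the mean.
from math import sqrt
from statistics import mean, stdev
from scipy import stats

times = [24, 27, 26, 29, 33, 21, 18, 24, 23, 34, 17, 15, 19, 23, 25,
         29, 36, 19, 18, 22, 16, 45, 32, 12, 24, 35, 14, 40, 30, 19,
         14, 28, 32, 15, 39]
n, x_bar, s = len(times), mean(times), stdev(times)
se = s / sqrt(n)  # estimated standard error of the mean

for level in (0.90, 0.99):
    lo, hi = stats.t.interval(level, df=n - 1, loc=x_bar, scale=se)
    print(f"{level:.0%} CI: ({lo:.2f}, {hi:.2f})  width = {hi - lo:.2f}")
# -> roughly (22.68, 27.44) and (21.22, 28.89); the 99% interval is wider.
```

Changing the tuple of levels is all that is needed to check any other confidence level.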
c) We want to test the claim that the mean time spent watching DVRs is 20 minutes each day using a significance level of 0.05.
The hypotheses are

H0: μ = 20 (the claim)  vs.  H1: μ ≠ 20

The test statistic is

t = (x̄ − μ0)/(s/√n) = (25.06 − 20)/(8.32/√35) ≈ 3.59
The two-tailed critical t value at the 5% level of significance with 34 degrees of freedom is 2.032. Since 3.59 > 2.032, we reject the null hypothesis at the 5% level of significance.
Therefore, there is sufficient evidence to reject the claim that the mean time spent watching DVRs is 20 minutes each day.
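The same test can be reproduced with SciPy's one-sample t test (again assuming SciPy is available; this is not part of the original StatDisk/TI-84/CrunchIt workflow):

```python
# Sketch of the one-sample t test; the null value 20 comes from the claim.
from scipy import stats

times = [24, 27, 26, 29, 33, 21, 18, 24, 23, 34, 17, 15, 19, 23, 25,
         29, 36, 19, 18, 22, 16, 45, 32, 12, 24, 35, 14, 40, 30, 19,
         14, 28, 32, 15, 39]

result = stats.ttest_1samp(times, popmean=20)
print(f"t = {result.statistic:.2f}, two-sided p = {result.pvalue:.4f}")
# -> t close to 3.60 (about 3.59 with the rounded summary statistics above)
#    and a p-value well below 0.05, matching the rejection of H0 at the 5% level.
```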