A recent study compared the time spent watching television together by single-earner and dual-earner couples. For the single-earner couples, the mean time spent watching television together was 61 minutes per day, with a standard deviation of 15.5 minutes. For the dual-earner couples, the mean was 48.4 minutes per day, with a standard deviation of 18.1 minutes. At the 0.01 significance level, can we conclude that single-earner couples spend, on average, more time watching television together? The study included 15 single-earner and 12 dual-earner couples. For the calculation, treat the single-earner couples as the first sample.
H0: μ1 ≤ μ2 (single-earner couples do not spend more time watching television together)
H1: μ1 > μ2 (single-earner couples spend more time watching television together)
The test statistic is t = (x̄1 - x̄2)/sqrt(s1^2/n1 + s2^2/n2)
= (61 - 48.4)/sqrt((15.5)^2/15 + (18.1)^2/12)
= 1.91
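As a quick check, the test statistic can be reproduced from the summary statistics in a few lines of Python (a minimal sketch; the variable names are illustrative):

```python
import math

# Summary statistics from the problem (single-earner = first sample)
xbar1, s1, n1 = 61.0, 15.5, 15   # single-earner couples
xbar2, s2, n2 = 48.4, 18.1, 12   # dual-earner couples

# Standard error of the difference in means, variances not pooled
se = math.sqrt(s1**2 / n1 + s2**2 / n2)
t = (xbar1 - xbar2) / se
print(round(t, 2))  # 1.91
```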
DF = (s1^2/n1 + s2^2/n2)^2/((s1^2/n1)^2/(n1 - 1) + (s2^2/n2)^2/(n2 - 1))
= ((15.5)^2/15 + (18.1)^2/12)^2/(((15.5)^2/15)^2/14 + ((18.1)^2/12)^2/11)
= 21.8, rounded to 22
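The Welch-Satterthwaite degrees of freedom can be checked the same way (again a sketch with illustrative names):

```python
s1, n1 = 15.5, 15   # single-earner couples
s2, n2 = 18.1, 12   # dual-earner couples

v1 = s1**2 / n1     # 16.02
v2 = s2**2 / n2     # 27.30
df = (v1 + v2)**2 / (v1**2 / (n1 - 1) + v2**2 / (n2 - 1))
print(round(df, 1)) # 21.8, rounded to 22 above
```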
At alpha = 0.01 with df = 22, the critical value for this one-tailed (right-tail) test is t* = 2.508.
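Rather than a t-table, the critical value can be looked up with SciPy's t distribution (assuming SciPy is available):

```python
from scipy import stats

# Upper-tail critical value for a one-tailed test at alpha = 0.01, df = 22
t_crit = stats.t.ppf(1 - 0.01, df=22)
print(round(t_crit, 3))  # 2.508
```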
Since the test statistic does not exceed the critical value (1.91 < 2.508), we fail to reject the null hypothesis.
So at the 0.01 significance level, we cannot conclude that single-earner couples spend, on average, more time watching television together.
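For completeness, SciPy can run the whole Welch test directly from the summary statistics in one call (a sketch assuming SciPy >= 1.6, where the alternative argument is available):

```python
from scipy import stats

# Welch's two-sample t-test from summary statistics
t_stat, p_value = stats.ttest_ind_from_stats(
    mean1=61.0, std1=15.5, nobs1=15,   # single-earner couples (first sample)
    mean2=48.4, std2=18.1, nobs2=12,   # dual-earner couples
    equal_var=False,                   # do not pool the variances
    alternative='greater',             # H1: mu1 > mu2
)
print(round(t_stat, 2))                # 1.91
print(p_value < 0.01)                  # False -> fail to reject H0
```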