An intelligence quotient, or IQ, is a measurement of intelligence derived from a standardized test such as the Stanford-Binet IQ test. Scores on the test are normally distributed with a mean of 100 and a standard deviation of 15.
a. Draw 1000 samples of size n = 9 from the distribution of IQ and calculate the sample mean of each sample.
b. Draw a histogram of the sample means from (a). What is the shape of the distribution of the sample means?
c. Calculate the mean and standard deviation of the sample means. Compare them with the population mean and standard deviation.
d. Simulate the sampling distribution of the sample mean again, this time using samples of size n = 25, and repeat parts (a)-(c).
e. How does the shape of the sampling distribution change as the sample size increases?
f. How does the standard error of the sample mean change as the sample size increases?
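A minimal R sketch of parts (a)-(d) (not the original script behind the output below; the seed and object names are arbitrary):

# Sampling distribution of the sample mean for IQ ~ N(100, 15)
set.seed(123)                      # arbitrary seed for reproducibility
mu    <- 100                       # population mean
sigma <- 15                        # population standard deviation

# (a) 1000 samples of size n = 9, keeping one sample mean per sample
xbar9 <- replicate(1000, mean(rnorm(9, mean = mu, sd = sigma)))

# (b) histogram of the 1000 sample means
hist(xbar9, main = "Sample means, n = 9", xlab = "Sample mean IQ")

# (c) mean and sd of the sample means, to compare with mu and sigma
mean(xbar9)    # close to 100
sd(xbar9)      # close to sigma / sqrt(9) = 5, much smaller than sigma = 15

# (d) repeat (a)-(c) with n = 25
xbar25 <- replicate(1000, mean(rnorm(25, mean = mu, sd = sigma)))
hist(xbar25, main = "Sample means, n = 25", xlab = "Sample mean IQ")
mean(xbar25)   # close to 100
sd(xbar25)     # close to sigma / sqrt(25) = 3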
mean of sample means (n = 9)  = 99.94091
sd of sample means (n = 9)    = 4.806963
theoretical standard error (n = 9)  = 15/√9  = 5
mean of sample means (n = 25) = 100.0586
sd of sample means (n = 25)   = 2.99567
theoretical standard error (n = 25) = 15/√25 = 3
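The theoretical standard errors above come from SE = σ/√n; a quick check of the arithmetic in R:

15 / sqrt(9)     # 5 for n = 9
15 / sqrt(25)    # 3 for n = 25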
The shape of the sampling distribution stays approximately normal (bell-shaped), but it becomes narrower and more concentrated around 100 as the sample size increases.
As the sample size increases, the mean of the sample means gets closer to the population mean of 100.
As the sample size increases, the standard deviation of the sample means becomes smaller.
As the sample size increases, the standard deviation of the sample means gets closer to the theoretical standard error σ/√n.
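To see the change in shape directly, the two sets of sample means can be drawn on a common x-axis; a minimal sketch, reusing the xbar9 and xbar25 vectors from the simulation sketch above:

par(mfrow = c(1, 2))                          # two panels side by side
xr <- range(c(xbar9, xbar25))                 # shared x-axis limits
hist(xbar9,  xlim = xr, main = "n = 9",  xlab = "Sample mean IQ")
hist(xbar25, xlim = xr, main = "n = 25", xlab = "Sample mean IQ")
par(mfrow = c(1, 1))                          # reset the plotting layout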
R OUTPUT and HISTOGRAM: