Suppose it is known that the IQ scores of a certain population of adults are approximately normally distributed with a standard deviation of 15. A simple random sample of 25 adults drawn from this population had a mean IQ score of 105.
a) Is there evidence at the 5% significance level that the average IQ in this population is not equal to 100?
Please also explain how you got the critical value.
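Here is a sketch of how I tried to set the calculation up (in Python with SciPy, and assuming a two-sided one-sample z-test is the right approach since the population standard deviation is known). Could someone check whether this is how the critical value is supposed to be found?

```python
from math import sqrt
from scipy.stats import norm

sigma = 15   # known population standard deviation
n = 25       # sample size
xbar = 105   # sample mean
mu0 = 100    # hypothesized mean under H0
alpha = 0.05 # significance level

# Test statistic: z = (xbar - mu0) / (sigma / sqrt(n))
z = (xbar - mu0) / (sigma / sqrt(n))  # = 5 / 3, about 1.667

# Two-sided critical value: z_{alpha/2}, the 97.5th percentile
# of the standard normal distribution
z_crit = norm.ppf(1 - alpha / 2)      # about 1.96

print(f"z = {z:.3f}, critical value = +/-{z_crit:.3f}")
# Since |z| < z_crit, I think we fail to reject H0 -- is that right?
```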
Thanks!!!