The distribution of sample means has the same mean as the underlying population, but the standard deviation differs. Use a real-world scenario to explain why it makes sense that the variation decreases as the sample size increases.
The variability that shrinks as the sample size increases is the variability of the sample mean.
Imagine you run an experiment where you collect 3 men and 3 women and measure their heights. How certain are you that the mean height of each group is the true mean of the separate populations of men and women? Not very certain at all, I should think. You could easily collect new samples of 3 and find new means several inches from the first ones. Quite a few repetitions of the experiment might even conclude that the women are taller than the men, because the means vary so much from sample to sample. With a small sample you don't have much certainty in the sample mean, and it varies a lot across samples.
Now imagine 10,000 observations in each group. It's going to be very hard to find new samples of 10,000 whose means differ much from each other. The sample means will be far less variable, and you'll be more confident that any one of them is close to the true population mean.
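If it helps to see that thought experiment in numbers, here is a minimal simulation sketch in Python (NumPy). The population values, a mean of 175 cm and an SD of 7 cm, are assumptions chosen purely for illustration, not anything from the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed, purely illustrative population of heights: mean 175 cm, SD 7 cm.
pop_mean, pop_sd = 175.0, 7.0
n_repeats = 1_000  # how many times we repeat the whole sampling experiment

for n in (3, 10_000):
    # Draw n_repeats independent samples of size n and take each sample's mean.
    heights = rng.normal(pop_mean, pop_sd, size=(n_repeats, n))
    sample_means = heights.mean(axis=1)
    print(f"n = {n:>6}: "
          f"mean of the sample means = {sample_means.mean():6.2f} cm, "
          f"SD of the sample means = {sample_means.std(ddof=1):5.3f} cm")
```

With these assumed numbers, the mean of the sample means should stay near 175 cm in both cases, while the SD of the sample means should drop from roughly 4 cm at n = 3 to under 0.1 cm at n = 10,000.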
If you can accept this line of thinking, then it enters the calculations of your statistics as the standard error of the mean, SE = s/√n. As you can see from the formula, it is an estimate of a parameter (the sample standard deviation s, which estimates the population standard deviation more accurately as n increases) divided by a quantity that always grows with n, namely √n. The standard error represents the variability of the means or effects in your calculations: the smaller it is, the more powerful your statistical test.
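As a quick check that the formula lines up with the simulation above, here is a small sketch that computes s/√n from a single sample and compares it with the spread of many sample means of the same size (again using the assumed 175 cm / 7 cm population):

```python
import numpy as np

rng = np.random.default_rng(1)

pop_mean, pop_sd = 175.0, 7.0  # same assumed, illustrative population as above
n = 50                         # size of the single sample we actually collect

sample = rng.normal(pop_mean, pop_sd, size=n)
s = sample.std(ddof=1)           # sample SD: our estimate of the population SD
standard_error = s / np.sqrt(n)  # standard error of the mean, s / sqrt(n)

# For comparison: the empirical SD of many sample means of the same size n.
many_means = rng.normal(pop_mean, pop_sd, size=(5_000, n)).mean(axis=1)

print(f"SE from one sample, s/sqrt(n):    {standard_error:.3f}")
print(f"SD of 5,000 sample means:         {many_means.std(ddof=1):.3f}")
print(f"Theoretical value, sigma/sqrt(n): {pop_sd / np.sqrt(n):.3f}")
```

All three numbers should land close to one another (around 1 cm with these assumed values), which is the sense in which the standard error computed from one sample stands in for the variability of the mean across all the samples you never actually collected.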