Suppose we are interested in determining a confidence interval for the population mean based on the sample mean. The confidence interval is given by

$\left[\bar{x} - z_{\alpha/2}\,\dfrac{\sigma}{\sqrt{n}},\ \ \bar{x} + z_{\alpha/2}\,\dfrac{\sigma}{\sqrt{n}}\right]$,

where $\bar{x}$ is the sample mean, $\sigma$ is the population standard deviation, $\alpha$ is the level of significance, and $n$ is the sample size.
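As a quick illustration, here is a minimal Python sketch of this z-interval. It assumes the population standard deviation is known, and the numbers plugged in at the bottom are purely hypothetical:

```python
from math import sqrt
from scipy.stats import norm

def z_confidence_interval(x_bar, sigma, n, alpha=0.05):
    """Two-sided z-interval for the mean when sigma is known."""
    z = norm.ppf(1 - alpha / 2)       # critical value z_{alpha/2}
    margin = z * sigma / sqrt(n)      # half-width of the interval
    return x_bar - margin, x_bar + margin

# hypothetical values, just for illustration
low, high = z_confidence_interval(x_bar=50.0, sigma=10.0, n=100, alpha=0.05)
print(low, high)                      # roughly (48.04, 51.96)
```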
We observe that the length of the confidence interval, $2\,z_{\alpha/2}\,\sigma/\sqrt{n}$, is inversely proportional to the square root of the sample size.
So, as the sample size increases, the length of the interval decreases, and hence the precision of the interval estimate increases.
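A small numerical sketch of this $1/\sqrt{n}$ behaviour, with an assumed $\sigma = 10$ and $\alpha = 0.05$: each time $n$ is quadrupled, the interval length is halved.

```python
from math import sqrt
from scipy.stats import norm

sigma, alpha = 10.0, 0.05            # assumed values for illustration
z = norm.ppf(1 - alpha / 2)

for n in (25, 100, 400):             # each step quadruples n
    width = 2 * z * sigma / sqrt(n)  # full interval length
    print(n, round(width, 2))        # 25 -> 7.84, 100 -> 3.92, 400 -> 1.96
```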
There is no single "optimal" sample size. However, given the desired margin of error $E$, the standard deviation $\sigma$, and the level of significance $\alpha$, we can determine the required sample size by solving $z_{\alpha/2}\,\sigma/\sqrt{n} \le E$ for $n$, which gives $n \ge \left(z_{\alpha/2}\,\sigma/E\right)^2$.
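For example, a minimal sketch of that sample-size calculation, again with hypothetical values for $\sigma$ and $E$:

```python
from math import ceil
from scipy.stats import norm

def required_sample_size(sigma, margin_of_error, alpha=0.05):
    """Smallest n with z_{alpha/2} * sigma / sqrt(n) <= margin_of_error (sigma assumed known)."""
    z = norm.ppf(1 - alpha / 2)
    return ceil((z * sigma / margin_of_error) ** 2)

# hypothetical: sigma = 10, desired margin of error E = 2, 95% confidence
print(required_sample_size(sigma=10.0, margin_of_error=2.0))  # 97
```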
Ideally, it is good to have a sample of size greater than 30, since by the central limit theorem we can then assume that the sampling distribution of the sample mean is approximately normal.