In: Statistics and Probability
What is the difference between reporting an estimate of mu using a margin of error versus using a confidence interval?
Margin of error = critical value × standard error of the statistic
A 100(1 − α)% confidence interval for a population parameter is
(value of statistic − margin of error, value of statistic + margin of error)
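To make the formula concrete, here is a minimal Python sketch that computes a margin of error and the corresponding confidence interval for a sample mean. The sample values are hypothetical, and the use of a z critical value is an assumption for illustration; with a small sample and unknown population standard deviation, a t critical value would normally be used instead.

```python
# Minimal sketch: margin of error and confidence interval for a sample mean.
# The data and the z-based critical value are assumptions for illustration.
import math
import statistics
from scipy.stats import norm

sample = [12.1, 11.8, 12.5, 12.0, 11.9, 12.3, 12.2, 11.7]  # hypothetical data
alpha = 0.05                                                # for a 95% interval

mean = statistics.mean(sample)
std_err = statistics.stdev(sample) / math.sqrt(len(sample))  # standard error of the mean
critical = norm.ppf(1 - alpha / 2)                           # z critical value, about 1.96

margin_of_error = critical * std_err
ci = (mean - margin_of_error, mean + margin_of_error)

print(f"point estimate   = {mean:.3f}")
print(f"margin of error  = {margin_of_error:.3f}")
print(f"95% CI           = ({ci[0]:.3f}, {ci[1]:.3f})")
```

The interval is always centered on the point estimate, which is why the margin of error fully determines the interval once the estimate is known.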
Therefore the margin of error is the range of values below and above the sample statistic, i.e. the "radius" (half the width) of the confidence interval. The confidence interval itself is the full estimated range of values that is likely to contain the unknown population parameter, calculated from a given set of sample data, and it is how we express the uncertainty attached to the statistic.
The margin of error states by how many percentage points our estimate may differ from the true population value. For example, a 100(1 − α)% confidence interval with a margin of error of b percentage points implies that the statistic will be within b percentage points of the true population value 100(1 − α)% of the time. We generally choose α = 0.01, 0.05, 0.1, etc.
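As a worked illustration of the "percentage points" reading, here is a short Python sketch for a poll proportion; the sample size and observed proportion are hypothetical values chosen only to show the arithmetic.

```python
# Minimal sketch: margin of error for a sample proportion, stated in
# percentage points. The poll size and observed proportion are hypothetical.
import math
from scipy.stats import norm

n = 1000          # hypothetical number of respondents
p_hat = 0.52      # hypothetical sample proportion (52%)
alpha = 0.05      # 95% confidence level

critical = norm.ppf(1 - alpha / 2)             # z critical value, about 1.96
std_err = math.sqrt(p_hat * (1 - p_hat) / n)   # standard error of a proportion
margin_of_error = critical * std_err

# Reported in a poll as "52%, with a margin of error of +/- 3.1 percentage points"
print(f"margin of error = {100 * margin_of_error:.1f} percentage points")
print(f"95% CI: {100 * (p_hat - margin_of_error):.1f}% to "
      f"{100 * (p_hat + margin_of_error):.1f}%")
```

With these numbers the margin of error works out to roughly 3.1 percentage points, so the reported interval would run from about 48.9% to 55.1%.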