In: Math
Find the standard deviation for a set of data that has a mean of 100 and 95% of the data falls between 70 and 130.
Please show me the procedure, thanks!
We are given:
mean = 100
range = 70 to 130 (containing 95% of the data)
By the 68-95-99.7 rule (the empirical rule):
68% of the data falls within 1 standard deviation of the mean,
95% of the data falls within 2 standard deviations, and
99.7% of the data falls within 3 standard deviations.
Since the range 70 to 130 contains 95% of the data, we use n = 2, where n is the number of standard deviations. Starting from the lower limit:
mean - n*(SD) = lower limit
100 - 2*(SD) = 70
2*(SD) = 30
SD = 15
Answer: SD = 15.
Similarly, we can obtain the standard deviation from the upper limit:
mean + n*(SD) = upper limit
100 + 2*(SD) = 130
2*(SD) = 30
SD = 15
Answer: SD = 15.
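The arithmetic above can be sketched as a short script; the variable names (mean, lower, upper, n) are my own labels for the quantities in the problem:

```python
# Solve for SD from both limits of the 95% interval.
mean, lower, upper = 100, 70, 130
n = 2  # 95% of the data lies within 2 standard deviations

sd_from_lower = (mean - lower) / n  # from mean - n*SD = lower
sd_from_upper = (upper - mean) / n  # from mean + n*SD = upper

print(sd_from_lower, sd_from_upper)  # both should be 15.0
```

Both limits give the same value because the interval is symmetric about the mean.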
---------------------------------------------------------------------------------------------------------------
For self-verification, we can check that the obtained standard deviation is correct:
2 standard deviations means the data deviates from the mean by 2*(SD) = 2*(15) = 30,
so the 95% interval is 100 - 30 = 70 to 100 + 30 = 130, which matches the given range.
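As a further sanity check, a quick simulation (a sketch, assuming the data is normally distributed with the mean and SD found above) should put roughly 95% of the samples inside 70 to 130:

```python
import random

# Simulate normal data with mean 100 and SD 15, then count the
# fraction landing within mean +/- 2*SD, i.e. the interval [70, 130].
random.seed(0)
mean, sd = 100, 15
samples = [random.gauss(mean, sd) for _ in range(100_000)]
within = sum(1 for x in samples if 70 <= x <= 130) / len(samples)
print(round(within, 3))  # close to 0.95
```

The exact theoretical fraction within 2 standard deviations is about 95.45%, so a large simulation will print a value near 0.95.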
==============================================================================================