Suppose it is desired to estimate the average time a customer spends in a particular store to within 5 minutes with 99% confidence. The range of the times customers spend in the store is estimated to be 90 minutes. How large a sample should be taken to get the desired interval?
We know the standard deviation is approximately 1/4 of the range, so the estimated standard deviation is range/4 = 90/4 = 22.5 minutes.
The following information has been provided:

estimated standard deviation σ = 22.5 minutes
desired margin of error E = 5 minutes
confidence level = 99%, so z = 2.576

The required sample size is n = (z × σ / E)² = (2.576 × 22.5 / 5)² ≈ 134.4, which we round up to n = 135 customers.
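For completeness, here is a minimal Python sketch of the same calculation. It assumes SciPy is available and uses norm.ppf to look up the critical value instead of reading it from a printed Z table; the variable names are just illustrative.

```python
from math import ceil
from scipy.stats import norm

# Estimate sigma from the range using the range/4 rule of thumb.
data_range = 90          # minutes
sigma = data_range / 4   # 22.5 minutes

E = 5                    # desired margin of error, minutes
confidence = 0.99

# Two-sided critical value z_{alpha/2} for 99% confidence (about 2.576).
z = norm.ppf(1 - (1 - confidence) / 2)

# Required sample size, rounded up to the next whole customer.
n = ceil((z * sigma / E) ** 2)
print(f"z = {z:.3f}, required sample size n = {n}")
```

Running this prints z ≈ 2.576 and n = 135, matching the hand calculation above.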