According to Harper’s Magazine, the time spent by kids in front of the television set per year can be modeled by a normal distribution with a mean equal to 1500 hours and a standard deviation equal to 250 hours. If 25 kids are randomly selected from this population, what is the probability that the average of their times spent watching television is at least 1650 hours per year?
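A sketch of one standard approach, using the sampling distribution of the sample mean under the normality assumption stated in the problem (the final value comes from a standard normal table):

\[
\sigma_{\bar{X}} = \frac{\sigma}{\sqrt{n}} = \frac{250}{\sqrt{25}} = 50
\]
\[
z = \frac{\bar{x} - \mu}{\sigma_{\bar{X}}} = \frac{1650 - 1500}{50} = 3
\]
\[
P(\bar{X} \ge 1650) = P(Z \ge 3) \approx 0.0013
\]

So the chance that the average of 25 randomly selected kids is at least 1650 hours per year is roughly 0.13%.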