In: Civil Engineering
You are driving along a straight road and cover 200 miles in 4 hours. Does that mean your speedometer read 50 mph for the entire trip? Is it necessary that your speedometer registered 50 mph at least once during the trip? Use math to explain your answer.
I understand that the speedometer would not read 50 mph for the entire trip, but I am not sure how to explain this using math.
From the basic formula,
Average Speed = Total Distance / Elapsed Time = 200 miles / 4 hours = 50 mph.
This defines only the average speed; the driver's actual speed varies during the trip, so the speedometer does not have to read 50 mph for the entire ride.
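The formula above can be checked with a couple of lines (this just restates the numbers from the problem):

```python
# Average speed from the numbers in the problem
total_distance = 200  # miles
elapsed_time = 4      # hours
average_speed = total_distance / elapsed_time
print(average_speed)  # prints 50.0
```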
By simple logic we can list different possibilities for the hourly speeds that give this average. We know that,
Average = Sum of the terms / Number of terms
Here are some possibilities where the four hourly speeds average 50 mph.
Possibility 1:
For the 1st hour he moves at a speed of 60 mph (the speedometer registers 50 mph while he speeds up to 60)
For the 2nd hour he moves at a speed of 40 mph
For the 3rd hour he moves at a speed of 30 mph
For the 4th hour he moves at a speed of 70 mph (the speedometer registers 50 mph again while he speeds up from 30 to 70)
Average = (60 + 40 + 30 + 70) / 4 = 50 mph
Possibility 2:
For the 1st hour he moves at a speed of 30 mph
For the 2nd hour he moves at a speed of 20 mph
For the 3rd hour he moves at a speed of 100 mph (the speedometer registers 50 mph while he speeds up from 20 to 100)
For the 4th hour he moves at a speed of 50 mph (the speedometer reads exactly 50 mph)
Average = (30 + 20 + 100 + 50) / 4 = 50 mph
Possibility 3:
For the 1st hour he moves at a speed of 40 mph
For the 2nd hour he moves at a speed of 30 mph
For the 3rd hour he moves at a speed of 20 mph
For the 4th hour he moves at a speed of 110 mph (the speedometer registers 50 mph while he speeds up from 20 to 110)
Average = (40 + 30 + 20 + 110) / 4 = 50 mph
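A short script can verify that each set of hourly speeds above does average 50 mph (a minimal check; the lists simply restate the speeds from the three possibilities):

```python
# Hourly speeds for the three possibilities above
possibilities = [
    [60, 40, 30, 70],   # Possibility 1
    [30, 20, 100, 50],  # Possibility 2
    [40, 30, 20, 110],  # Possibility 3
]
for speeds in possibilities:
    average = sum(speeds) / len(speeds)
    print(speeds, "averages", average, "mph")  # each averages 50.0 mph
```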
There can be any number 'N' of possibilities that attain this average speed. But in every case the speedometer must register 50 mph at least once during the trip. The reason is that speed changes continuously: if the car were always slower than 50 mph it would cover less than 200 miles in 4 hours, and if it were always faster it would cover more, so at some instant it must be moving at exactly 50 mph. Formally, this is the Mean Value Theorem from calculus: if s(t) is the car's position at time t, there is some instant c in (0, 4) at which the instantaneous speed s'(c) equals the average speed, (s(4) - s(0)) / 4 = 200 / 4 = 50 mph.
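To see numerically why a continuously varying speed must hit 50 mph, here is a sketch using a hypothetical piecewise-linear speed profile (the `speed` function and its profile are made up for illustration; it starts from rest and passes through the hourly readings of Possibility 1):

```python
# Hypothetical continuous speed profile: starts at rest and interpolates
# linearly between the hourly readings of Possibility 1 (60, 40, 30, 70 mph).
# Because speed varies continuously, it must pass through 50 mph on its way
# between values below 50 and values above 50.

def speed(t):
    """Speed (mph) at time t (hours), interpolated between hourly readings."""
    points = [(0, 0), (1, 60), (2, 40), (3, 30), (4, 70)]
    for (t0, v0), (t1, v1) in zip(points, points[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    raise ValueError("t must be between 0 and 4 hours")

# Look for moments where the profile crosses 50 mph (sign change of speed - 50)
samples = [i / 1000 for i in range(4001)]  # t = 0.000, 0.001, ..., 4.000
crossings = [t for t, u in zip(samples, samples[1:])
             if (speed(t) - 50) * (speed(u) - 50) <= 0]
print(len(crossings) > 0)  # prints True: the speedometer reads 50 mph at some instant
```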