In: Physics
A football is to be thrown by a quarterback to a receiver who is running at a constant velocity of 10.4 m/s directly away from the quarterback, who intends for the ball to be caught a distance of 38.4 m away. At what distance should the receiver be from the quarterback when the ball is released? Assume the football is thrown at an initial angle of 45.0° and that it is caught at the same height at which it is released.
The range of a projectile on level ground is given as:
R = v0^2 sin(2θ) / g { eq.1 }
where, R = intended range of the ball = 38.4 m
θ = launch angle = 45.0°
g = acceleration due to gravity = 9.8 m/s^2
Note that eq.1 involves the ball's initial speed v0, not the receiver's speed. Solving eq.1 for v0:
v0 = sqrt(R g / sin 2θ) = sqrt[(38.4 m)(9.8 m/s^2) / sin 90.0°]
v0 = sqrt(376.32 m^2/s^2)
v0 = 19.4 m/s
The time of flight follows from the horizontal motion:
x = (v0 cos θ) t
t = x / (v0 cos θ) { eq.2 }
where, x = horizontal distance traveled by the ball = 38.4 m
inserting the values in eq.2,
t = (38.4 m) / [(19.4 m/s) cos 45.0°]
t = (38.4 m) / (13.7 m/s)
t = 2.80 s
During this time the receiver, running at vr = 10.4 m/s, covers:
d = vr t = (10.4 m/s)(2.80 s) = 29.1 m
So at the instant the ball is released, the receiver should be a distance
x0 = 38.4 m − 29.1 m = 9.3 m
from the quarterback.
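
The kinematics can be checked numerically. This is a minimal sketch assuming g = 9.8 m/s^2 and a 45.0° launch angle, as in the problem statement; it finds the ball's launch speed from the range formula, the flight time from the horizontal motion, and the receiver's separation at release:

```python
import math

g = 9.8                     # m/s^2, acceleration due to gravity
theta = math.radians(45.0)  # launch angle
R = 38.4                    # m, intended range of the ball
v_r = 10.4                  # m/s, receiver's constant speed

# Ball's launch speed from the range formula R = v0^2 sin(2*theta) / g
v0 = math.sqrt(R * g / math.sin(2 * theta))

# Flight time from the horizontal motion x = (v0 cos theta) * t
t = R / (v0 * math.cos(theta))

# Distance the receiver covers while the ball is in the air
d_receiver = v_r * t

# Separation between quarterback and receiver at the moment of release
x0 = R - d_receiver

print(f"v0 = {v0:.1f} m/s, t = {t:.2f} s, release distance = {x0:.1f} m")
```

Running this confirms a launch speed near 19.4 m/s, a flight time near 2.80 s, and a release distance near 9.3 m.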