In: Physics
Suppose you throw a rock off a 35 m cliff at an angle of 42° above horizontal, at a speed of 20 m/s. How far horizontally from the base of the cliff will the rock hit the ground below?
For an oblique (projectile) throw, the equations of motion on the two axes (Oy = positive sense upward) are:

Ox: x = v0·cos α · t, with vx = v0·cos α (constant)
Oy: y = v0·sin α · t − g·t²/2, with vy = v0·sin α − g·t
For maximum height the condition is vy = 0. From here the climbing time is extracted:

tc = v0·sin α / g ≈ 1.37 s
If this is substituted into the equation of motion on the Ox axis, the horizontal distance corresponding to the maximum height (ym) is obtained:

xm = v0·cos α · tc ≈ 20.3 m

The horizontal distance at which the rock returns to the level of the cliff top (the point it was thrown from), the so-called range for this first part of the flight, is given by:

2xm = 2·v0·cos α · tc ≈ 40.6 m

The total time for this first part is 2tc ≈ 2.7 s.
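As a quick numerical check of this first part (not part of the original solution), here is a short Python sketch, assuming g = 9.8 m/s²:

```python
import math

g = 9.8                      # gravitational acceleration, m/s^2 (assumed)
v0 = 20.0                    # launch speed, m/s
alpha = math.radians(42)     # launch angle above horizontal

vx = v0 * math.cos(alpha)    # horizontal speed, constant throughout
vy0 = v0 * math.sin(alpha)   # initial vertical speed

tc = vy0 / g                 # climbing time (vy = 0 at the top)
xm = vx * tc                 # horizontal distance at maximum height

print(f"tc  = {tc:.2f} s")   # ~1.37 s
print(f"2tc = {2*tc:.2f} s") # ~2.73 s
print(f"xm  = {xm:.1f} m")   # ~20.3 m
print(f"2xm = {2*xm:.1f} m") # ~40.6 m
```

The printed values reproduce the climbing time and range quoted above.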
So far I have treated the first part of the motion: the flight from the launch point until the rock comes back down to the horizontal level of the cliff top.
The next part starts where the rock drops below the Ox axis (point M in the figure).
At point M the speeds are:

vx = v0·cos α ≈ 14.9 m/s (unchanged)
vy = v0·sin α ≈ 13.4 m/s, now directed downward (the same magnitude as at launch, by symmetry)

The height the rock falls beneath M is 35 m. Thus, the equation of motion on the Oy axis (taking downward as positive for this part) can be written:

35 = v0·sin α · t' + g·t'²/2

where t' = the time for the rock to reach the ground beneath the cliff.
Solving this quadratic and keeping the positive root, we obtain t' ≈ 1.64 s.

The horizontal distance covered from point M on is given by:

x = v0·cos α · t' ≈ 24.3 m

So, the total horizontal distance from the base of the cliff to the point where the rock hits the ground is:

2xm + x ≈ 40.6 m + 24.3 m ≈ 64.9 m
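The whole two-part calculation can be verified numerically, again assuming g = 9.8 m/s². The cross-check at the end solves a single quadratic over the entire flight (launch level to ground, 35 m below) and should agree with the two-part sum:

```python
import math

g = 9.8                      # m/s^2 (assumed)
v0 = 20.0                    # launch speed, m/s
alpha = math.radians(42)     # launch angle
h = 35.0                     # cliff height, m

vx = v0 * math.cos(alpha)
vy0 = v0 * math.sin(alpha)

# Part 1: up and back down to launch level.
tc = vy0 / g
x1 = vx * 2 * tc             # = 2*xm

# Part 2: from M the vertical speed is vy0 downward; fall h = 35 m.
# h = vy0*t' + g*t'^2/2  ->  positive root of the quadratic.
tp = (-vy0 + math.sqrt(vy0**2 + 2 * g * h)) / g
x2 = vx * tp

print(f"t'    = {tp:.2f} s")       # ~1.64 s
print(f"total = {x1 + x2:.1f} m")  # ~64.9 m

# Cross-check: one quadratic over the whole flight (y = -h at impact).
T = (vy0 + math.sqrt(vy0**2 + 2 * g * h)) / g
print(f"check = {vx * T:.1f} m")   # same result
```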