A photographer in a helicopter ascending vertically at a constant rate of 15.0 m/s accidentally drops a camera out the window when the helicopter is 54.0 m above the ground.
a) How long will the camera take to reach the ground?
b) What will its speed be when it hits?
The initial velocity of the camera (equal to the helicopter's upward velocity): u = 15 m/s
The height from which it is dropped: h = 54 m
A)
Taking the drop point as the origin and measuring displacement downward as positive, the camera's downward displacement after time t is
h = (1/2)gt^2 - ut    (the initial velocity u points upward)
(1/2)gt^2 - ut - h = 0
(1/2)(9.8)t^2 - 15t - 54 = 0
4.9t^2 - 15t - 54 = 0
Solving this quadratic and taking the positive root gives the time for the camera to reach the ground: t ≈ 5.19 s
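As a quick numerical check, here is a minimal Python sketch (not part of the original solution) that solves the same quadratic for its positive root; the variable names and the value g = 9.8 m/s^2 are assumptions made for illustration.

```python
# Sketch: solve (1/2)*g*t^2 - u*t - h = 0 for the positive root.
import math

g = 9.8    # gravitational acceleration, m/s^2 (assumed value)
u = 15.0   # initial upward speed of the camera, m/s
h = 54.0   # height above the ground at the moment of release, m

a, b, c = 0.5 * g, -u, -h                            # coefficients of a*t^2 + b*t + c = 0
t = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)    # positive root only
print(f"time to reach the ground: t = {t:.2f} s")    # prints ~5.19 s
```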
B)
Taking upward as positive (so a = -g), the velocity when the camera hits the ground is
v = u + at
= 15 - 9.8 x 5.186    (using the unrounded value of t)
= -35.8 m/s
The negative sign indicates the velocity is directed downward, so the camera strikes the ground at a speed of about 35.8 m/s.
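A matching sketch for part (b), again with assumed variable names and g = 9.8 m/s^2, cross-checks the result against the time-free relation v^2 = u^2 + 2gh, which avoids any rounding of the fall time.

```python
# Sketch: impact velocity from v = u + a*t (up positive, a = -g),
# cross-checked with the time-free relation v^2 = u^2 + 2*g*h.
import math

g, u, h = 9.8, 15.0, 54.0   # u and h from the problem; g = 9.8 m/s^2 assumed
t = 5.186                   # fall time from part (a), s

v = u - g * t                            # signed velocity; negative = downward
v_check = -math.sqrt(u**2 + 2 * g * h)   # same quantity without using t
print(f"v = {v:.1f} m/s, check = {v_check:.1f} m/s, speed = {abs(v):.1f} m/s")
```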