A hot air balloon is descending at a rate of 2.0 m/s when a passenger drops a camera.
(a) If the camera is 35 m above the ground when it is dropped, how long does it take for the camera to reach the ground?
(b) What is its velocity just before it lands? Let upward be the positive direction for this problem.
I got it wrong, so I'm asking you guys. Thanks for the help!
When the camera was dropped, the starting speed (v_0) was 2.0 m/s downward (the descent speed of the hot air balloon).
From that moment, gravity affects the camera, causing an acceleration of 9.8 m/s^2.
Inserting those parameters in the equation:
y=y_0 + v_0*t + 0.5 * a * t^2
(symbols: y = position at time t, y_0 = starting position, v_0 = starting speed, a = acceleration, t = time)
35 = 0 + 2*t + 0.5*9.8*t^2 => t = 2.47631 s or -2.88447 s (we discard the negative root, since the camera is dropped at t = 0)
Note: we took the fall point as the starting point and vertical direction to be downwards. (that's why we put y as 35 and y_0 as 0)
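As a quick sanity check on the quadratic above (a sketch only, variable names are my own), you can solve 0.5*a*t^2 + v_0*t - y = 0 directly with the quadratic formula:

```python
import math

# Downward taken as positive, release point as origin (same as the post above)
y = 35.0   # distance to the ground, m
v0 = 2.0   # initial downward speed, m/s
a = 9.8    # gravitational acceleration, m/s^2

# Roots of 0.5*a*t^2 + v0*t - y = 0
disc = math.sqrt(v0**2 + 2 * a * y)
t_pos = (-v0 + disc) / a   # physical root
t_neg = (-v0 - disc) / a   # unphysical (before release)

print(round(t_pos, 5), round(t_neg, 5))  # 2.47632 -2.88448
```

The positive root matches the t = 2.476 s answer; the negative one is the mathematical artifact we throw away.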
We know the acceleration and the starting speed and the time it hit the ground, so putting all this in the equation:
v = v_0 + a * t
(symbols: v = speed at time t, v_0 = starting speed, a = acceleration, t = time)
v = 2 + 9.8 * 2.47631 => v = 26.27 m/s downward. Since the problem says to take upward as positive, you'd report this as v = -26.27 m/s.
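As a quick numeric check of that last step (just plugging the numbers back in):

```python
v0 = 2.0       # initial downward speed, m/s
a = 9.8        # gravitational acceleration, m/s^2
t = 2.47631    # fall time from part (a), s

v = v0 + a * t  # downward-positive convention
print(round(v, 2))  # 26.27 m/s downward, i.e. -26.27 m/s with upward positive
```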
Hope I helped. (and that it is correct)
0 = -35 + 2.0t + 4.9t² ⇒ t = 2.476 s
Vf = -√(2.0² + 2g·35) = -26.268 m/s