You throw a baseball into the air with an initial velocity of 30 feet per second. When the ball leaves your hand it is 5 feet above the ground. When your friend catches it, it is at a height of 6 feet above the ground. How long is the ball in the air?

Also, if you could show me how you did it using the vertical motion model, that would be awesome. Thanks!

Answer:

pmayl
Assuming this little game of catch took place on planet Earth, the acceleration due to gravity is -9.8 m/s².
Converting 30 ft/s to m/s, the initial velocity was 30 ft/s x 0.3048 m/ft ≈ 9.14 m/s.

Let's find how long it took for the velocity to reach zero, meaning when the ball reached its highest point and, for a split second, hung motionless in mid-air before falling back down.

V(t) = Vi + a*t, where V(t) is velocity as a function of time, Vi is the initial velocity, a is the acceleration due to gravity, and t is time. Set V(t) = 0:
0 = 9.14 + (-9.8)*t          Add 9.8t to both sides
9.8t = 9.14                  Divide both sides by 9.8
t ≈ 0.93 seconds
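
If you want to sanity-check that step, here is a tiny Python sketch (the variable names are just illustrative):

    # time to reach the peak: solve 0 = v_i - g*t for t
    v_i = 9.14   # initial velocity in m/s (30 ft/s converted above)
    g = 9.8      # magnitude of gravitational acceleration, m/s^2
    t_up = v_i / g
    print(round(t_up, 2))   # -> 0.93 seconds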

Let's take your hand as the base point, where h = 0. We want to find how high above your hand the ball went before it started coming down. Using the distance (or in this case height) formula:
h = Vi*t + (1/2)a*t²               Plug in Vi, a, and our t value, 0.93
h = 9.14(0.93) + (1/2)(-9.8)(0.93²)
h = 8.50 - 4.9(0.865)
h = 8.50 - 4.24
h ≈ 4.26 meters

The ball made it about 4.26 meters above your hand. Your friend's hand was one foot above yours, so let's subtract 0.305 meters to see how far the ball dropped from its peak to his hand:
4.26 - 0.305 ≈ 3.96 meters
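
The same numbers can be checked in Python; this short sketch recomputes the peak height and the drop to the friend's hand (small rounding differences from the hand calculation are expected):

    # peak height above the throwing hand, then the drop to the friend's hand
    v_i, g = 9.14, 9.8
    t_up = v_i / g
    h_peak = v_i * t_up - 0.5 * g * t_up**2   # -> about 4.26 m
    drop = h_peak - 0.305                     # friend's hand is 1 ft (~0.305 m) higher
    print(round(h_peak, 2), round(drop, 2))   # -> 4.26 3.96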

Let's use the distance formula again to see how long it took to come down. Remember that this time the initial velocity is zero, since the ball starts from rest at its peak.
-3.96 = 0*t + (1/2)(-9.8)(t²)         Divide both sides by -9.8/2, or -4.9
0.808 = t²
t ≈ 0.90 seconds
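
Again, a quick Python check of the fall time (assuming the 3.96 m drop from the previous step):

    # time to fall 3.96 m from rest: solve 3.96 = (1/2)*g*t**2 for t
    import math
    g = 9.8
    t_down = math.sqrt(2 * 3.96 / g)
    print(round(t_down, 2))   # -> 0.9 seconds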

The ball took 0.93 seconds to go up and 0.90 seconds to come back down to your friend's glove. The total time the ball was in the air:
0.93 + 0.90 ≈ 1.83 seconds
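
Since you asked about the vertical motion model: in feet it is usually written h(t) = -16t² + v₀t + h₀, where -16 ft/s² is half the gravitational acceleration (about -32 ft/s²), v₀ = 30 ft/s is the initial velocity, and h₀ = 5 ft is the starting height. Setting h(t) = 6 and solving the quadratic gives the catch time directly; here is a short Python sketch of that route (it agrees with the 1.83 s above up to rounding):

    # vertical motion model: -16t^2 + 30t + 5 = 6  ->  16t^2 - 30t + 1 = 0
    import math
    a, b, c = 16.0, -30.0, 1.0
    disc = b**2 - 4*a*c
    t_passes_6ft = (-b - math.sqrt(disc)) / (2*a)   # ~0.03 s, passes 6 ft on the way up
    t_caught     = (-b + math.sqrt(disc)) / (2*a)   # ~1.84 s, caught on the way down
    print(round(t_passes_6ft, 2), round(t_caught, 2))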