A pitcher throws a baseball horizontally from the mound to home plate. The ball falls 1.039 m (3.41 ft) by the time it reaches home plate 18.3 m (60 ft) away. How fast was the pitcher's pitch?
-
First, use a vertical position equation to find the time it takes for the pitch to drop:

S = -(a/2)t^2 + V0*t + S0

Here V0 is the initial vertical velocity, which is zero because the pitch is thrown horizontally; S0 is the initial position, which we'll call zero; and S is the final position, -1.039 meters (negative because it's down). With a = g = 9.8 m/s^2:

-1.039 = -4.9t^2
t = 0.46 seconds for the ball to drop.
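If you want to sanity-check that number, here's a quick Python sketch of the same step (the variable names are my own; it just assumes g = 9.8 m/s^2 as above):

import math

g = 9.8        # gravitational acceleration, m/s^2
drop = 1.039   # how far the ball falls, m

# With V0 = 0 and S0 = 0, the drop is (g/2)*t^2, so t = sqrt(2*drop/g)
t = math.sqrt(2 * drop / g)
print(round(t, 2))   # 0.46 seconds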
Now do the horizontal portion. From the previous part, we know it takes 0.46 seconds for the ball to drop that far, which also means it takes 0.46 seconds for the ball to reach home plate. Since home plate is 18.3 meters away:

V = distance/time = 18.3 meters / 0.46 seconds = 39.8 m/s (that's an 89-mile-per-hour pitch)
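Continuing that Python sketch with the horizontal step (the 2.237 factor for converting m/s to mph is my own addition):

t = 0.46          # s, fall time from the vertical step above
distance = 18.3   # m, mound to home plate
v = distance / t  # horizontal speed of the pitch
print(round(v, 1))        # 39.8 m/s
print(round(v * 2.237))   # about 89 mph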
-
Screw with your teacher: say you forgot to factor in the fact that the pitcher is throwing at a downward angle to start with. That means any answer is wrong.
-
Time to fall 1.039 m: t = sqrt(2h/g) = 0.46 s.
18.3 / 0.46 = 39.78 m/s.
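Equivalently, as a one-line Python check (variable names are my own choice):

g, h, d = 9.8, 1.039, 18.3
v = d / (2 * h / g) ** 0.5   # v = d / sqrt(2h/g)
print(round(v, 2))           # 39.74 m/s; using the rounded t = 0.46 s gives 39.78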