A pitcher throws a ball with a velocity of 101.0 mi/h. If a pitch were thrown horizontally with this velocity, how far would the ball fall vertically by the time it reached home plate, which is 60.5 ft away? I don't know how to set it up, please help.
-
First convert mi/h to ft/sec; that gives you your horizontal velocity (101.0 mi/h × 5280 ft/mi ÷ 3600 sec/h ≈ 148 ft/sec). Then set up a table like this:

        Down          Horizontal
Vi:     0 ft/sec      148 ft/sec
Vf:     ?             148 ft/sec
D:      ?             60.5 ft
A:      -32 ft/sec²   0 ft/sec²
T:      ?             ?
Find the time for the horizontal section, then use that time in the Down column to get distance.
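The two steps above can be checked numerically; here's a quick sketch in Python (plain arithmetic, values taken straight from the problem: 101.0 mi/h, 60.5 ft, g = 32 ft/sec²):

```python
# Step 0: convert the pitch speed from mi/h to ft/sec
v_mph = 101.0
v_fps = v_mph * 5280 / 3600      # 5280 ft per mile, 3600 sec per hour
print(f"horizontal velocity: {v_fps:.1f} ft/sec")   # ~148.1 ft/sec

# Step 1 (Horizontal column): velocity is constant, so T = D / Vi
t = 60.5 / v_fps
print(f"time to reach the plate: {t:.3f} sec")      # ~0.408 sec

# Step 2 (Down column): Vi = 0, so D = (1/2) * g * T^2
g = 32.0
drop = 0.5 * g * t**2
print(f"vertical drop: {drop:.2f} ft")              # ~2.67 ft
```

So the ball falls roughly 2.7 ft on its way to the plate.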