Professionals typically use a radar gun to measure the speed of a pitched baseball, but you can also calculate a pitch's speed with equipment you have at home. This procedure measures the average speed of a baseball in feet per second as it travels from the pitcher's mound to home plate. You can then convert that value to miles per hour, the unit most commonly used for pitch speeds.
Measure the distance from the pitcher's mound to home plate with a measuring tape. This distance is 60.5 feet on a Major League Baseball diamond.
Time the pitch from the moment the baseball leaves the pitcher's hand until it reaches the catcher's glove. For example, assume that the ball's travel time is 0.47 seconds.
Compute the baseball's average speed in feet per second with the formula S = D / T, where "S" is the average speed, "D" is the distance the ball traveled and "T" is the travel time. If the distance is 60.5 feet and the time is 0.47 seconds, the speed of the baseball is 60.5 / 0.47, or approximately 129 feet per second.
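The S = D / T calculation above can be sketched in a few lines of Python (the function name is just for illustration):

```python
# Average speed (S = D / T): distance in feet, time in seconds.
def average_speed_fps(distance_ft, time_s):
    return distance_ft / time_s

# Example from the text: 60.5 feet covered in 0.47 seconds.
print(round(average_speed_fps(60.5, 0.47)))  # about 129 feet per second
```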
Multiply the speed of the baseball in feet per second by 0.682 to convert it to miles per hour (mph). This factor comes from the unit conversion: there are 3,600 seconds in an hour and 5,280 feet in a mile, and 3,600 / 5,280 is approximately 0.682. For example, a speed of 129 feet per second equals 129 x 0.682 = 88 mph.
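The feet-per-second to miles-per-hour conversion can be expressed the same way; the constant name here is an assumption chosen for readability:

```python
# 3600 seconds per hour / 5280 feet per mile, rounded to 0.682
FPS_TO_MPH = 0.682

def fps_to_mph(speed_fps):
    return speed_fps * FPS_TO_MPH

print(round(fps_to_mph(129)))  # 88 mph
```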
Calculate the speed of the baseball directly from its travel time in seconds. Assuming the distance is 60.5 feet, the equation S = (60.5 / T) x 0.682 = 41.3 / T gives the ball's speed in miles per hour, where "T" is the time in seconds. A travel time of 0.47 seconds therefore indicates a speed of 41.3 / 0.47 = 88 mph.
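Putting the two steps together, a one-call helper (a sketch that assumes the standard 60.5-foot distance; the names are illustrative) might look like:

```python
# Speed in mph directly from travel time: S = (60.5 / T) * 0.682 = 41.3 / T
def pitch_speed_mph(time_s, distance_ft=60.5):
    return (distance_ft / time_s) * 0.682

print(round(pitch_speed_mph(0.47)))  # 88 mph
```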