18.75 km/h.
The runner's average speed is greatest over the interval in which they cover the most distance in the least time; on a distance-time graph, that is where the slope is steepest.
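As a quick illustration, here is a minimal Python sketch (the sample data is hypothetical) that computes the average speed over each segment of a distance-time record; the segment with the largest slope is where the average speed is greatest:

    # Hypothetical (time in s, distance in m) points read off a distance-time graph.
    samples = [(0, 0), (5, 20), (10, 55), (15, 100)]

    # The average speed over a segment is its slope:
    # (change in distance) / (change in time).
    for (t0, d0), (t1, d1) in zip(samples, samples[1:]):
        print(f"{t0}-{t1} s: {(d1 - d0) / (t1 - t0):.2f} m/s")
    # The largest printed value corresponds to the steepest segment.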
To calculate the runner's average speed, use the formula: speed = distance ÷ time. In this case, the distance is 100 meters and the time is 15 seconds, so the runner's average speed is 100 meters ÷ 15 seconds ≈ 6.67 meters per second.
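A quick check of that arithmetic in Python (a sketch using only the numbers given above):

    # speed = distance / time
    distance_m = 100.0  # meters
    time_s = 15.0       # seconds
    print(f"{distance_m / time_s:.2f} m/s")  # 6.67 m/s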
The equation is 2x + 2(x + 2) = 28, where x is the speed of the slower runner. Solving for x gives 6, the speed of the slower runner; adding 2 gives the speed of the faster runner, 8.
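As a sanity check of the algebra (a small Python sketch; the variable names are illustrative):

    # 2x + 2(x + 2) = 28  ->  4x + 4 = 28  ->  4x = 24  ->  x = 6
    x = (28 - 4) / 4
    slower = x       # 6.0, speed of the slower runner
    faster = x + 2   # 8.0, speed of the faster runner
    print(slower, faster)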
The average speed of a runner who runs 500.0 m in 1.3 min is 500.0 m ÷ 78 s ≈ 6.41 m/s (1.3 min = 78 s).
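To verify (a minimal Python sketch using only the figures above):

    distance_m = 500.0
    time_s = 1.3 * 60  # 1.3 min = 78 s
    print(f"{distance_m / time_s:.2f} m/s")  # 6.41 m/s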
4.6 m/s
9 hours
The answer depends on the distance of the race. A marathon runner could not maintain the speed attained by a sprinter.
4.60
He completes it in 69.9 seconds.
The average speed of the runner is 1500 m ÷ (180 s + 49.67 s) = 1500 m ÷ 229.67 s ≈ 6.531 m/s, or roughly 13 knots.
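A short Python check of that calculation, including the knot conversion (1 knot = 0.514444 m/s):

    distance_m = 1500.0
    time_s = 3 * 60 + 49.67  # 3 min 49.67 s = 229.67 s
    speed = distance_m / time_s
    print(f"{speed:.3f} m/s")               # 6.531 m/s
    print(f"{speed / 0.514444:.1f} knots")  # ~12.7 knots, i.e. roughly 13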