If you average 60 miles per hour for the entire distance, your driving time would be about 28 hours and 40 minutes.
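As a quick check (a minimal sketch; the trip distance is not stated above, but 60 mph for 28 hours 40 minutes implies roughly 1,720 miles):

```python
# Back out the implied distance, then recompute the driving time from it.
speed_mph = 60.0
time_hours = 28 + 40 / 60              # 28 h 40 min = 28.67 h
distance_miles = speed_mph * time_hours
print(distance_miles)                  # 1720.0 miles (implied trip distance)

# time = distance / speed
hours = distance_miles / speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")                # 28 h 40 min
```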
To calculate the runner's average speed, you would use the formula: speed = distance ÷ time. In this case, the distance is 100 meters and the time is 15 seconds. Therefore, the average speed of the runner would be 100 meters ÷ 15 seconds = 6.67 meters per second.
To calculate average speed, you would use the formula: average speed = total distance / total time. In this case, the total distance is 100 meters and the total time is 55 seconds. Therefore, the average speed would be 100 meters / 55 seconds = 1.82 meters per second.
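Both of the answers above use the same formula, so a small helper makes the arithmetic explicit (a minimal sketch; the function name is ours, not from the original):

```python
def average_speed(distance_m: float, time_s: float) -> float:
    """Average speed in m/s: total distance divided by total time."""
    return distance_m / time_s

print(round(average_speed(100, 15), 2))  # 6.67 m/s (100 m in 15 s)
print(round(average_speed(100, 55), 2))  # 1.82 m/s (100 m in 55 s)
```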
65 seconds
Average speed = distance divided by time taken, and the unit is meters per second (written m/s), so remember to convert the distance to meters and the time to seconds if they are not already given in those units.
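For example, converting to SI units before dividing (the values here are hypothetical, chosen just to illustrate the conversion step):

```python
# Hypothetical example: 2 km covered in 4 minutes.
distance_m = 2 * 1000                  # km -> m
time_s = 4 * 60                        # min -> s
print(round(distance_m / time_s, 2))   # 8.33 m/s
```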
To get average speed, divide the distance travelled by the time taken. The answer will be in meters per second. If you want the result in kilometers per hour, multiply the number of meters per second by 3.6.
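The factor 3.6 comes from 3600 seconds per hour divided by 1000 meters per kilometer; a minimal sketch:

```python
def ms_to_kmh(speed_ms: float) -> float:
    """Convert m/s to km/h: (3600 s/h) / (1000 m/km) = 3.6."""
    return speed_ms * 3.6

print(ms_to_kmh(25))   # 90.0 km/h
```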
The car's average speed is calculated by dividing the distance traveled by the time taken. In this case, the average speed would be 25 meters per second (5 meters / 0.2 seconds).
To calculate the average speed of a cheetah that sprints 100 meters in 4 seconds, you divide the distance by the time. The average speed is 100 meters divided by 4 seconds, which equals 25 meters per second. Therefore, the average speed of the cheetah is 25 m/s.
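Notably, the car and the cheetah work out to the same value; a quick check of both divisions:

```python
# speed = distance / time
print(5 / 0.2)    # 25.0 m/s (car: 5 m in 0.2 s)
print(100 / 4)    # 25.0 m/s (cheetah: 100 m in 4 s)
```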
7.14 seconds.
320 meters
Since length is measured in meters and time in seconds, it follows that speed is measured in meters per second.
Average acceleration = change in speed ÷ time taken for the change = 15 ÷ 5 = 3 meters per second squared (m/s²).
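The same division pattern applies here, with a change in speed in place of a distance (a minimal sketch; the function name is ours):

```python
def average_acceleration(delta_v_ms: float, time_s: float) -> float:
    """Average acceleration in m/s^2: change in speed over the time taken."""
    return delta_v_ms / time_s

print(average_acceleration(15, 5))   # 3.0 m/s^2
```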