Assume the runner can maintain a steady pace for the duration of the run.
1 mile per 10 minutes = 6 mph
A marathon course measures 26 miles 385 yd = 26.21875 miles.
Velocity (speed) = Distance ÷ Time, so Time = Distance ÷ Velocity.
Time taken = 26.21875 ÷ 6 = 4.3698 hours = 4 hours 22 minutes 11 seconds
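As a quick check, a few lines of Python reproduce the arithmetic above (the 6 mph pace and the 26 miles 385 yd distance are the assumptions from this answer):

    # Marathon time at a steady 6 mph (10-minute-mile) pace
    distance_miles = 26 + 385 / 1760    # 26 miles 385 yd = 26.21875 miles
    speed_mph = 6.0                     # 1 mile per 10 minutes
    hours = distance_miles / speed_mph  # time = distance / velocity
    h = int(hours)
    m = int((hours - h) * 60)
    s = round(((hours - h) * 60 - m) * 60)
    print(f"{h} h {m} min {s} s")       # 4 h 22 min 11 s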
You can't. It would be like extrapolating a marathon time from a mile time.
It takes less time to walk a mile!
The excess is 1/2 minute per mile. The marathon is 26 and 385/1760 miles. The aggregate excess time is 0.5 x (26 + 385/1760) = 13.109 minutes (rounded) = 13 min 6.6 sec
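A short Python sketch of that same aggregation, assuming the 1/2-minute-per-mile excess stated above:

    # Aggregate excess at 1/2 minute per mile over the marathon distance
    marathon_miles = 26 + 385 / 1760         # 26.21875 miles
    excess_min = 0.5 * marathon_miles        # 13.109375 minutes
    minutes = int(excess_min)
    seconds = (excess_min - minutes) * 60
    print(f"{minutes} min {seconds:.1f} s")  # 13 min 6.6 s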
In one minute, a car traveling at a speed of 60 miles per hour would cover a distance of 1 mile.
Well, if you're running at a 7-minute-mile pace it would take about 35 minutes.
Well, you have to figure that at 60 mph the car would travel a mile a minute; not sure at this time what the exact answer is.
Assuming a distance of 42.195 kilometers, the average speed for a 4 hour 31 minute time would be 5.8 miles per hour or 9.3 km per hour.
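A minimal Python sketch of that average-speed calculation, assuming the 42.195 km distance and 4 hour 31 minute finish from this answer:

    # Average speed for a 4 h 31 min marathon (42.195 km assumed)
    distance_km = 42.195
    time_hours = 4 + 31 / 60        # 4.5167 hours
    kmh = distance_km / time_hours  # ~9.3 km/h
    mph = kmh / 1.609344            # ~5.8 mph
    print(f"{kmh:.1f} km/h = {mph:.1f} mph")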
To achieve a time of 18 minutes, you would need to run about a 6-minute-mile pace.
Average would be about a 10-minute-per-mile pace, so 30 minutes. A decent time is sub-20 minutes.
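A small Python sketch of the pace arithmetic behind both answers, assuming (as they imply) a 3-mile run:

    # Total time for a run at a given per-mile pace
    def total_minutes(pace_min_per_mile, miles=3):
        return pace_min_per_mile * miles

    print(total_minutes(6))   # 18 min (roughly a 6-minute-mile effort)
    print(total_minutes(10))  # 30 min (an average pace)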
Eamonn Coghlan broke the 4-minute mile after age 40.
1 minute 14 seconds
The time required to travel 1/10 of a mile would depend upon the speed. The latitude/longitude "minute" is 1/60 of a degree, and is measured as flat on maps although it lies on the curved surface of the Earth. A minute of latitude is equal to a nautical mile, which is 1.15 miles or 1.85 kilometers. A minute of longitude is equal to that only at the equator, and becomes shorter as you move toward the poles and the lines of longitude converge. So there would be about 0.087 latitude minutes in 1/10 of a mile.
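A short Python sketch of that last conversion; the 1.15078 miles-per-nautical-mile figure used here is a slightly more precise value than the 1.15 quoted above:

    # Latitude minutes spanned by 1/10 of a statute mile
    nautical_mile_in_miles = 1.15078  # 1 minute of latitude ~ 1 nautical mile
    tenth_mile = 0.1
    lat_minutes = tenth_mile / nautical_mile_in_miles
    print(f"{lat_minutes:.3f} latitude minutes")  # ~0.087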