5/7 of an hour (about 42 minutes 51 seconds)
Covering the same distance in less time would increase your speed in inverse proportion: if the time is halved, for example, the speed would double.
Assuming he keeps up the same average speed: the distance is 3000 m / 2000 m = 1 1/2 times as far, so it will take him 1 1/2 times as long, namely 25 × 1 1/2 = 37 1/2 minutes = 37 minutes 30 seconds.
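As a quick check, here is a minimal Python sketch of that proportional-scaling calculation, using the 2000 m / 25 minute figures from the answer above:

```python
# At a constant average speed, time scales linearly with distance.
known_distance_m = 2000    # distance covered in the known time
known_time_min = 25        # time for that distance
new_distance_m = 3000      # distance we want the time for

new_time_min = known_time_min * (new_distance_m / known_distance_m)  # 37.5

minutes = int(new_time_min)
seconds = round((new_time_min - minutes) * 60)
print(f"{new_time_min} minutes = {minutes} min {seconds} s")
# -> 37.5 minutes = 37 min 30 s
```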
500 / (speed of a roadrunner in meters per second) seconds, since time = distance ÷ speed.
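A minimal Python sketch of that formula; the 9 m/s roadrunner speed used in the example is an assumed illustrative value, not a measured one:

```python
def travel_time_seconds(distance_m: float, speed_m_per_s: float) -> float:
    """Time in seconds to cover distance_m at a constant speed_m_per_s."""
    return distance_m / speed_m_per_s

# Assumed speed: roughly 9 m/s (~20 mph) is often quoted for roadrunners,
# but treat it as a placeholder and substitute your own figure.
print(travel_time_seconds(500, 9))  # ~55.6 seconds
```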
1/10 = 0.1 hours (6 minutes)
Somewhere between 8 and 10 minutes, but it also depends on your speed and the terrain.
It depends on how fast you can run and for how long you can sustain that speed.
It depends on your speed.
30 minutes and 3 billion years
At a pace of 10 m/s, it would take him 50 seconds.
It depends on the speed at which the runner is travelling.
Assuming you are running at a constant speed, exactly 6 minutes.
12 minutes and 1 second at a constant 5 mph, excluding speed changes or stops.
A total of 30 seconds, assuming they run at a constant speed.
69 minutes
It would take a cheetah approximately 133 days (about 3,185 hours) to reach the moon, assuming the cheetah could run at its top speed of 75 mph for the entire distance, and that the moon is about 238,855 miles away on average.
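Here is the arithmetic behind that figure in Python, using the same 75 mph and 238,855 mile assumptions stated in the answer:

```python
distance_miles = 238_855  # average Earth-Moon distance (from the answer)
speed_mph = 75            # assumed constant cheetah top speed

hours = distance_miles / speed_mph  # ~3184.7 hours
days = hours / 24                   # ~132.7 days
print(f"{hours:.0f} hours ≈ {days:.0f} days")
# -> 3185 hours ≈ 133 days
```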
It would take ~6.67 minutes (6 minutes 40 seconds) to run 1 km at 9 km/h.