Suppose you travel a distance of 100 miles and it takes 1 1/2 hours. Your average speed is 100 miles divided by 1.5 hours, which equals about 66.67 miles per hour. When calculating miles per hour for trips that take only minutes, first convert the minutes to a fraction of an hour by dividing by 60, then divide the distance by that fraction.
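The arithmetic above can be sketched as a small Python function (the function name is illustrative, not from any particular library):

```python
def average_mph(distance_miles, minutes):
    """Average speed: convert minutes to a fraction of an hour, then divide."""
    hours = minutes / 60          # e.g. 90 minutes -> 1.5 hours
    return distance_miles / hours

# 100 miles in 1 1/2 hours (90 minutes):
print(round(average_mph(100, 90), 2))  # 66.67 mph
```

The same function handles short trips: 5 miles in 6 minutes gives 5 / 0.1 = 50 mph.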
That doesn't make sense: an hour is a measure of time and a mile is a measure of distance.
If you mean how many 4-mile lengths make 1 mile, the answer is 0.25. If you mean how many 0.4-mile lengths make 1 mile, the answer is 2.5.
1 mile = 5,280 feet
1 / 42 mph = 0.02381 hours per mile. 0.02381 hours per mile × 3,600 seconds per hour = 85.7143 seconds per mile.
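That conversion is just the reciprocal of the speed scaled to seconds; a minimal sketch (the function name is illustrative):

```python
def seconds_per_mile(mph):
    # hours per mile is the reciprocal of miles per hour;
    # multiplying by 3600 converts hours to seconds
    return 3600 / mph

print(round(seconds_per_mile(42), 4))  # 85.7143 seconds per mile
```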