79.2 mph. There are 60/25 = 2.4 twenty-five-minute intervals in an hour, so 33 miles x 2.4 = 79.2 miles per hour.
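As a quick sanity check on that unit conversion, here is a minimal Python sketch; the function name mph_from is my own label for illustration, not something from the original answer.

```python
def mph_from(miles: float, minutes: float) -> float:
    """Average speed in miles per hour for a distance covered in a given number of minutes."""
    hours = minutes / 60.0   # convert minutes to hours
    return miles / hours     # speed = distance / time

print(round(mph_from(33, 25), 1))   # 79.2 mph, matching the answer above
```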
The jogger's average speed is 6 miles per hour.
If you run a quarter mile in 3 minutes, your average speed is (0.25 miles / 3 minutes) * (60 minutes / 1 hr) = (0.25 * 60) / 3 = 5 mph.
This is an average speed of 28.8 mph.
To determine the number of miles traveled in 25 minutes, you need to know the speed at which the distance is covered. If you are traveling at a constant 60 miles per hour, then 25 minutes is 25/60 ≈ 0.4167 hours, and multiplying that by 60 miles per hour gives 25 miles. Therefore, at 60 miles per hour you would cover 25 miles in 25 minutes.
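The same distance = speed x time step can be written as a short Python sketch; the helper name miles_traveled is mine, used purely for illustration.

```python
def miles_traveled(mph: float, minutes: float) -> float:
    """Distance covered at a constant speed (mph) over a duration given in minutes."""
    hours = minutes / 60.0   # 25 minutes is roughly 0.4167 hours
    return mph * hours       # distance = speed * time

print(round(miles_traveled(60, 25), 2))   # 25.0 miles
```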
If you traveled 25 miles in 2 minutes, your average speed was 25 miles per 2 minutes, or 750 miles per hour. If you decelerated at a constant rate all the way to a stop, your average speed is the mean of your initial and final speeds, so your initial speed was twice 750 mph, or 1,500 mph. I do not have enough information to determine your initial velocity, because I don't know what direction you were going, and velocity is speed with a direction.
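A small Python sketch of that kinematics step, assuming the run decelerates uniformly from some initial speed down to rest; the function name is mine, not from the answer.

```python
def initial_speed_mph(miles: float, minutes: float) -> float:
    """Initial speed for a run that decelerates uniformly to a complete stop.

    With constant deceleration from an initial speed v0 down to 0, the average
    speed over the run is v0 / 2, so v0 is twice the average speed.
    """
    avg_mph = miles / (minutes / 60.0)   # 25 miles in 2 minutes -> 750 mph average
    return 2 * avg_mph                   # v0 = 2 * average speed

print(round(initial_speed_mph(25, 2)))   # 1500 mph
```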
At 60 MPH average, about 25 minutes.
If you travelled at a mile a minute, or 60 mph, you would cover 25 miles in 25 minutes. To cover 10 miles in the same time, you only need 10/25 = 2/5 of that speed, or 24 mph.
At 60 miles per hour, about 25 minutes, since you cover a mile a minute.
25 miles / 5 hours = 5 miles per hour
If you maintained an average speed of 25 mph, the time to drive 13 miles would be 13/25 hours, or about 31 minutes. You would have to add time for stops, detours and other delays.
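Here is a minimal Python sketch of that time = distance / speed calculation; the name drive_minutes is an illustrative choice of mine.

```python
def drive_minutes(miles: float, mph: float) -> float:
    """Driving time in minutes for a distance at a constant average speed."""
    hours = miles / mph   # time = distance / speed
    return hours * 60.0   # convert hours to minutes

print(round(drive_minutes(13, 25)))   # about 31 minutes, before stops and delays
```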
The fish travelled 15 miles in 52 minutes.
Average speed = distance/time = 100 miles / 4 hours = 25 mph.