While it seems straightforward to take the average of 30 and 40 (35 mph), the answer is more complex, because LESS TIME is spent driving at the FASTER speed than at the SLOWER speed.
Using English measurements, the average speed is computed with the formula Va = (d1 + d2) / (t1 + t2):
1 mile at 30 mph - time is 1/30 hour or 2 minutes.
1 mile at 40 mph - time is 1/40 hour or 1.5 minutes.
Total distance traveled = 2 miles
Total time elapsed = 3.5 minutes
Average speed = 2 miles / 3.5 minutes = 0.5714 miles per minute = 34.3 mph
--------
If the question had been phrased as "one MINUTE at 30 mph and one MINUTE at 40 mph", then the answer would indeed be the obvious average speed of 35 mph.
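As a quick illustration of the distance-weighted versus time-weighted cases above, here is a minimal Python sketch (the function names are purely illustrative) that computes average speed as total distance divided by total time:

```python
# Minimal sketch: average speed = total distance / total time.
# Function and variable names are illustrative, not from any library.

def avg_speed_equal_distances(d_miles, speeds_mph):
    """Average speed when the SAME distance is driven at each speed."""
    total_distance = d_miles * len(speeds_mph)
    total_time = sum(d_miles / s for s in speeds_mph)  # hours
    return total_distance / total_time

def avg_speed_equal_times(t_hours, speeds_mph):
    """Average speed when the SAME time is spent at each speed."""
    total_distance = sum(t_hours * s for s in speeds_mph)  # miles
    total_time = t_hours * len(speeds_mph)
    return total_distance / total_time

print(avg_speed_equal_distances(1, [30, 40]))   # ~34.29 mph (one mile at each speed)
print(avg_speed_equal_times(1 / 60, [30, 40]))  # 35.0 mph (one minute at each speed)
```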
A 4-minute mile means an average speed of 15 miles per hour.
This is an average speed of 4.3 miles per hour.
177
192.307692 mph
5 mph (miles per hour)
This is an average speed of 28.8 mph.
You can't. Simply averaging the speeds (30 + 90 = 120, and 120/2 = 60) is wrong, because less time is spent at the higher speed; you need to solve for average speed as total distance over total time, not average the speeds. 1 mile @ 30 mph takes 2 minutes, and 2 miles at an average of 60 mph must also take exactly 2 minutes, so the entire time budget is used up on the first mile. Therefore you cannot go fast enough (you would have to drive the second mile infinitely fast).
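A short Python sketch of that time-budget argument (variable names are illustrative; it assumes 1 mile already driven at 30 mph and a 60 mph target over 2 miles):

```python
# Sketch of the "average 60 mph over two miles" check.
first_mile_speed = 30      # mph
target_average = 60        # mph, over the full 2 miles

time_allowed = 2 / target_average     # 2 miles at 60 mph -> 1/30 hour (2 minutes)
time_used = 1 / first_mile_speed      # first mile at 30 mph -> 1/30 hour (2 minutes)
time_left = time_allowed - time_used  # 0 hours remain for the second mile

if time_left <= 0:
    print("Impossible: the entire time budget is spent on the first mile.")
else:
    print(f"Required second-mile speed: {1 / time_left:.1f} mph")
```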
The average speed is determined only by the total distance traveled and the total time. Thus, the average speed is: (1 + 10 + 60) miles / 6 hours ≈ 11.83 mph
Jenna's average speed was 8 miles per hour.
A 15-minute mile equates to an average speed of 0.067 miles per minute, or 4 miles per hour.
1 hour = 60 minutes
60 miles/hour = 60 miles / 60 minutes = 1 mile/minute
375 miles ---> 375 minutes = 6 hours 15 minutes
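The same conversion as a small Python sketch (assuming 375 miles at a steady 60 mph; variable names are illustrative):

```python
# Travel time from distance and speed, converted to hours and minutes.
distance_miles = 375
speed_mph = 60

total_minutes = distance_miles / speed_mph * 60   # at 60 mph, 1 mile takes 1 minute
hours, minutes = divmod(total_minutes, 60)
print(f"{int(hours)} hours {int(minutes)} minutes")  # 6 hours 15 minutes
```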
A 14-minute mile equates to an average speed of 4.3 miles per hour.
This is an average speed of 4.77 miles per hour.
A 139-mile drive at an average of 70 mph will take around two hours (139 / 70 ≈ 1.99 hours).