Average speed = (total distance) / (total time) = (80+100) / (2+3) = 180/5 = 36 miles per hour
36 mph
44 mph
Average speed was 65 mph: 100 miles + 420 miles = 520 miles; 2 hours + 6 hours = 8 hours; 520 / 8 = 65 mph.
To get the answer, divide how many miles he drove by how long it took him to drive it. The information you need to set up the problem is 150 miles and 2.5 hours: 150 / 2.5 = 60, so Jordan drove an average of 60 miles per hour.
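The division above can be sketched in a couple of lines of Python (the variable names are illustrative, not from the original problem):

```python
# Average speed = total distance / total time.
distance_miles = 150   # how far Jordan drove
time_hours = 2.5       # how long the trip took

average_mph = distance_miles / time_hours
print(average_mph)  # 60.0
```

The same two-line calculation works for any of the single-trip answers in this list: plug in the trip's total miles and total hours.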
Your average speed was 59.3 miles per hour.
He drove 180 miles in 5 hours for an average speed of 36 miles per hour.
100 miles
44 mph
36 mph
180 miles / 5 hours = 36 miles per hour (average)
180 miles, 5 hours; 36 mph
36 mph
36 miles per hour
That would depend on the average speed. If the average is 50 mph, they drove 650 miles.
(80 + 100) / (2 + 3) = 180/5 = 36 miles per hour.
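For a multi-leg trip like this one, the key point is that average speed is total distance over total time, not the mean of the per-leg speeds. A short Python sketch using the two legs from the answer above (80 miles in 2 hours, 100 miles in 3 hours):

```python
# Two legs of the trip as (miles, hours) pairs.
legs = [(80, 2), (100, 3)]

total_miles = sum(d for d, _ in legs)   # 180
total_hours = sum(t for _, t in legs)   # 5

average_mph = total_miles / total_hours
print(average_mph)  # 36.0
```

Averaging the leg speeds instead (40 mph and about 33.3 mph) would give roughly 36.7 mph, which is wrong because the legs take different amounts of time.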
60 miles
300 miles