Average speed was 65 mph: 100 miles + 420 miles = 520 miles total; 2 hours + 6 hours = 8 hours total; 520 / 8 = 65 mph.
Your average speed was 59.3 miles per hour.
An average speed of 40 miles per hour.
To get the answer, divide the distance he drove by how long it took him to drive it. The information you need is 150 miles and 2.5 hours: 150 / 2.5 = 60, so Jordan drove an average of 60 miles per hour.
Stacy's average speed was 50 miles per hour. To calculate average speed, you divide the total distance traveled (200 miles) by the total time taken (4 hours).
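The rule used in these answers — total distance divided by total time, even when the trip has several legs — can be sketched in a few lines of Python. The function name `average_speed` and the `(miles, hours)` tuple format are my own choices for illustration, not from the original answers.

```python
def average_speed(legs):
    """Average speed in mph over one or more legs, each given as (miles, hours)."""
    total_miles = sum(distance for distance, _ in legs)
    total_hours = sum(time for _, time in legs)
    return total_miles / total_hours

# Single leg: 200 miles in 4 hours
print(average_speed([(200, 4)]))            # 50.0 mph

# Two legs: 100 miles in 2 hours, then 420 miles in 6 hours
print(average_speed([(100, 2), (420, 6)]))  # 65.0 mph
```

Note that for a multi-leg trip you sum distances and times first and divide once; averaging the per-leg speeds directly would give the wrong answer whenever the legs take different amounts of time.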
He drove 180 miles in 5 hours for an average speed of 36 miles per hour.
That would depend on the average speed. If the average is 50 mph, they drove 650 miles.
300 MILES
52 m.p.h.
25
60
60
50 mph (miles per hour)
60 miles
441/7 = 63 mph