441 miles ÷ 7 hours = 63 mph
To find Jordan's average speed, divide the total distance by the total time. He drove 150 miles in 2.5 hours, so his average speed is 150 miles ÷ 2.5 hours = 60 miles per hour. Therefore, Jordan's average speed was 60 mph.
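The same division can be written as a short Python sketch for a quick sanity check; the function name average_speed_mph is mine, chosen for illustration, not anything from the original answer.

```python
# Average speed = total distance / total time.
def average_speed_mph(distance_miles: float, time_hours: float) -> float:
    return distance_miles / time_hours

# Jordan's trip: 150 miles in 2.5 hours.
print(average_speed_mph(150, 2.5))  # 60.0
```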
The average speed was 65 mph: 100 miles + 420 miles = 520 total miles; 2 hours + 6 hours = 8 total hours; 520 miles ÷ 8 hours = 65 mph.
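For a multi-leg trip like this one, the key point is to sum the distances and times first and divide only once. A minimal sketch, using the leg values from the worked answer above:

```python
# Each leg is (miles, hours); values taken from the answer above.
legs = [(100, 2), (420, 6)]

total_miles = sum(d for d, _ in legs)  # 520 miles
total_hours = sum(t for _, t in legs)  # 8 hours
print(total_miles / total_hours)       # 65.0 mph
```

Note that averaging the per-leg speeds (50 mph and 70 mph) would give 60 mph, which is wrong here because the legs take different amounts of time.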
Your average speed was 59.3 miles per hour.
An average speed of 40 miles per hour.
To get the answer, divide the distance he drove by the time it took him to drive it. The numbers you need to set up the problem are 150 miles and 2.5 hours: 150 ÷ 2.5 = 60. So Jordan drove at an average of 60 miles per hour.
He drove 180 miles in 5 hours for an average speed of 36 miles per hour.
That would depend on the average speed. At an average of 50 mph, a 13-hour drive covers 50 × 13 = 650 miles.
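This answer rearranges the same formula to solve for distance instead of speed. A minimal sketch (the 13-hour figure is implied by the answer, since 650 miles ÷ 50 mph = 13 hours):

```python
# Distance = average speed × time.
def distance_miles(avg_speed_mph: float, time_hours: float) -> float:
    return avg_speed_mph * time_hours

print(distance_miles(50, 13))  # 650
```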
300 miles
25
52 mph
60
50 mph (miles per hour)
60 miles