Best Answer

To get the answer, divide the distance he drove by the time it took him to drive it. The information you need to set up the problem is 150 miles and 2.5 hours.

150 / 2.5 = 60

So Jordan drove an average of 60 miles per hour.
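The arithmetic above can be checked with a short script (the variable names are just illustrative):

```python
# Average speed = total distance / total time
distance_miles = 150
time_hours = 2.5

average_speed = distance_miles / time_hours  # miles per hour
print(average_speed)  # 60.0
```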

Q: Jordan drove from his home to his new college. He drove 150 miles in 2.5 hours. What was his average speed in miles per hour?


Continue Learning about Math & Arithmetic

441/7 = 63 mph

Average speed = (total distance) / (total time) = (80+100) / (2+3) = 180/5 = 36 miles per hour

Average speed was 65 mph: 100 miles + 420 miles = 520 miles; 2 hours + 6 hours = 8 hours; 520/8 = 65 mph.
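Both multi-leg answers above apply the same rule: add the distances, add the times, then divide. A small sketch of that rule, using the leg values from the examples above:

```python
def average_speed(legs):
    """legs: list of (distance_miles, time_hours) pairs for each leg of a trip."""
    total_distance = sum(d for d, t in legs)
    total_time = sum(t for d, t in legs)
    return total_distance / total_time

print(average_speed([(80, 2), (100, 3)]))   # 180/5 = 36.0 mph
print(average_speed([(100, 2), (420, 6)]))  # 520/8 = 65.0 mph
```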

17000 miles / 317 days × 1 day/24 hours = 2.2345 miles per hour average. That assumes you drive every hour of every day. If instead you drove 8 hours, stopped, then drove another 8 hours 16 hours later, the elapsed-time average would be the same, but your speed while actually driving would be different: 17000 miles / 317 days × 1 day/8 hours = 6.7 miles per hour.
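The distinction made above between averaging over elapsed time and over driving time can be sketched as:

```python
miles = 17000
days = 317

# Average over every hour of every day (elapsed time)
overall_mph = miles / (days * 24)
# Average over driving time only, assuming 8 hours of driving per day
driving_mph = miles / (days * 8)

print(round(overall_mph, 4))  # 2.2345
print(round(driving_mph, 1))  # 6.7
```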

Your average speed was 59.3 miles per hour.

Related questions

60 miles

60 mph

60


That would depend on the average speed. If the average is 50 mph, they drove 650 miles.

52 m.p.h.

25

He drove 180 miles in 5 hours for an average speed of 36 miles per hour.

300 miles
