During the first hour, they covered (60 x 1) = 60 miles.
During the next 2 hours, they covered (45 x 2) = 90 miles.
The total distance covered was (60 + 90) = 150 miles.
The average rate was (distance/time) = (150/3) = 50 miles per hour.
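The arithmetic above follows the standard formula: average speed = total distance / total time. A minimal Python sketch of that calculation (the function name `average_speed` is just illustrative):

```python
def average_speed(distance_miles, time_hours):
    """Average speed is total distance divided by total time."""
    return distance_miles / time_hours

# First hour at 60 mph, next 2 hours at 45 mph:
total_distance = 60 * 1 + 45 * 2  # 150 miles
total_time = 1 + 2                # 3 hours
print(average_speed(total_distance, total_time))  # → 50.0
```

Note that the answer is not the average of 60 and 45 (52.5), because more time was spent at the slower speed.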
441/7 = 63 mph
Average speed = (total distance) / (total time) = (80 + 100) / (2 + 3) = 180/5 = 36 miles per hour
To get the answer, divide how many miles he drove by how long it took him to drive it. The information you need to set up the problem is 150 miles and 2.5 hours: 150 / 2.5 = 60, so Jordan drove an average of 60 miles per hour.
Average speed was 65 mph: 100 miles + 420 miles = 520 miles; 2 hours + 6 hours = 8 hours; 520/8 = 65 mph.
17,000 miles / 317 days × 1 day/24 hours = 2.2345 miles per hour average. That assumes you drive every hour of every day. If instead you drove 8 hours, stopped, then 16 hours later drove another 8 hours, your average over elapsed time would be the same, but your speed while actually driving would be different: 17,000 miles / 317 days × 1 day/8 hours = 6.7 miles per hour.
676/52 = 13 hours
60
That would depend on the average speed. If the average is 50 mph, they drove 650 miles.
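Answers like the two above rearrange the same relationship, distance = rate × time, to solve for whichever quantity is missing. A short sketch of both rearrangements (the helper names are just illustrative):

```python
def distance(rate_mph, time_hours):
    """Distance covered at a constant average rate."""
    return rate_mph * time_hours

def time_needed(distance_miles, rate_mph):
    """Time required to cover a distance at a constant average rate."""
    return distance_miles / rate_mph

print(distance(50, 13))      # → 650  (miles at 50 mph for 13 hours)
print(time_needed(676, 52))  # → 13.0 (hours to cover 676 miles at 52 mph)
```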
25
52 m.p.h.
He drove 180 miles in 5 hours for an average speed of 36 miles per hour.
60 mph
300 miles