It was: 200/4 = 50 mph
Stacy's average speed was 50 miles per hour. To calculate average speed, you divide the total distance traveled (200 miles) by the total time taken (4 hours).
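The division described above is all the computation there is. As a minimal Python sketch (the function name average_speed is illustrative, not from the original answer):

```python
def average_speed(distance_miles, time_hours):
    """Average speed = total distance / total time."""
    return distance_miles / time_hours

# Stacy's trip: 200 miles in 4 hours
print(average_speed(200, 4))  # 50.0 mph
```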
Average speed was 65 mph:
100 miles + 420 miles = 520 miles total
2 hours + 6 hours = 8 hours total
520/8 = 65 mph
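For a trip with several legs, the same rule applies: sum all the distances, sum all the times, then divide. A hedged sketch of that generalization (the helper average_speed_over_legs is hypothetical):

```python
def average_speed_over_legs(legs):
    """legs: list of (distance_miles, time_hours) pairs.
    Average speed = sum of distances / sum of times."""
    total_distance = sum(d for d, _ in legs)
    total_time = sum(t for _, t in legs)
    return total_distance / total_time

# Two legs: 100 miles in 2 hours, then 420 miles in 6 hours
print(average_speed_over_legs([(100, 2), (420, 6)]))  # 65.0 mph
```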
To get the answer, divide the distance Jordan drove by the time it took him to drive it. The information you need to set up the problem is 150 miles and 2.5 hours: 150 / 2.5 = 60. So Jordan drove an average of 60 miles per hour.
Average speed = (total distance) / (total time) = (80+100) / (2+3) = 180/5 = 36 miles per hour
Your average speed was 59.3 miles per hour.
441/7 = 63 mph