180 miles (60 x 3 = 180).
If you drove 35 mph for 2 hours, you would drive 70 miles. 35 X 2 = 70.
(4 hours) x (70 mi/hr) = 280 miles
The answer is in the "miles per hour" (mph) you drove. If you drove 40 mph, then in one hour you would have driven 40 miles. Therefore, in five hours, at a constant speed of 40 mph, you would have driven 200 miles.
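As a quick sanity check, the same multiplication can be written as a tiny Python helper; the function name and the 40 mph / 5 hour example are just the numbers from the answer above, nothing more.

    def distance_traveled(speed_mph, hours):
        # Distance covered at a constant speed: distance = speed * time
        return speed_mph * hours

    print(distance_traveled(40, 5))  # 200 (miles), matching the answer above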
1.6428571 hours
To get the answer, you would divide how many miles he drove by how long it took him to drive it. The information you need to set up your problem is 150 miles and 2.5 hours. 150 / 2.5 = 60, so Jordan drove an average of 60 miles per hour.
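If you would rather let a script do the division, here is a minimal Python sketch of the same distance / time calculation; the helper name is invented for this illustration.

    def average_speed(miles, hours):
        # Average speed = total distance / total time
        return miles / hours

    print(average_speed(150, 2.5))  # 60.0 mph, as in Jordan's trip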
He would travel 3/5 of that or 165 miles.
That would depend on the average speed. If the average is 50 mph, they drove 650 miles.
332.5 miles
To calculate the time it takes to travel a certain distance at a given speed, use the formula time = distance / speed. In this case, 375 miles / 65 miles per hour = 5.77 hours, so it would take approximately 5.77 hours (about 5 hours 46 minutes) to travel 375 miles at 65 miles per hour.
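The same formula is easy to check with a couple of lines of Python; travel_time is just an illustrative name, not part of any particular library.

    def travel_time(miles, speed_mph):
        # Time in hours = distance / speed
        return miles / speed_mph

    print(round(travel_time(375, 65), 2))  # 5.77 hours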
If you mean how far you would travel in 4 hours at 26.6 miles per hour, you would travel 106.4 miles (26.6 x 4 = 106.4).
4 hours 25 minutes!
Multiplying time by speed gives a total distance of 8 x 40.8 = 326.4 miles.