90 mph
90 mph
40 mph
90 mph
Her average speed was 58 miles per hour.
If you drove 60 miles in 60 minutes, you would be going 60 miles per hour. If you drove 60 miles in 30 minutes, you would be going 120 miles per hour. As travel time decreases, speed increases.
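If you want to check this relationship yourself, here is a small Python sketch of the same arithmetic (the function name is just for illustration):

```python
def speed_mph(distance_miles, time_minutes):
    """Speed in miles per hour from a distance and a time in minutes."""
    return distance_miles / (time_minutes / 60)

# Same distance, half the time, double the speed.
print(speed_mph(60, 60))  # 60 miles in 60 minutes -> 60.0 mph
print(speed_mph(60, 30))  # 60 miles in 30 minutes -> 120.0 mph
```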
Janice was driving 60 miles per hour.
Well, if you drove for 60 minutes (one hour) at that speed, you would travel 128 miles. But you only traveled for 15 minutes, one fourth of an hour. Divide 128 by 4, and that is your answer: 32 miles.
To calculate the average rate of speed, we need to find the total distance and divide it by the total time. In this case, the total distance is 5 miles + 10 miles + 35 miles = 50 miles. Since the time taken is 40 minutes, we convert it to hours by dividing by 60: 40 minutes ÷ 60 = 2/3 hours. Therefore, the average rate of speed is 50 miles ÷ 2/3 hours = 75 miles per hour.
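If it helps to double-check, the same steps can be written out in a few lines of Python (the leg distances are the ones from the question above):

```python
legs_miles = [5, 10, 35]       # the three legs of the trip
total_miles = sum(legs_miles)  # 50 miles in total
total_hours = 40 / 60          # 40 minutes as a fraction of an hour
average_mph = total_miles / total_hours
print(average_mph)
```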
Janice was driving at 60 miles per hour. The way you figure this is to first look at Tom's speed. He was doing 90 mph for 20 minutes, which is one third of an hour, so when he caught Janice he had gone one third of 90 miles, or 30 miles. Janice had been driving for 10 minutes before Tom left, so she had driven 30 minutes total and had also gone 30 miles. That means Janice was traveling at a mile a minute, or 60 miles an hour. Good luck with your homework.
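Here is the same catch-up reasoning as a short Python sketch, in case you want to try other numbers (the variable names are just for illustration):

```python
tom_speed = 90      # mph
tom_minutes = 20    # Tom drives 20 minutes before catching Janice
head_start = 10     # Janice left 10 minutes before Tom did

# Both drivers have covered the same distance when Tom catches up.
distance = tom_speed * tom_minutes / 60          # 30 miles
janice_minutes = tom_minutes + head_start        # 30 minutes on the road
janice_speed = distance / (janice_minutes / 60)  # miles per hour
print(janice_speed)
```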
Time = Distance ÷ Speed, so time = 150 miles ÷ 45 miles per hour = 3 1/3 hours = 3 hours 20 minutes.
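To turn the fractional hours into hours and minutes yourself, a quick Python check looks like this (numbers taken from the answer above):

```python
distance = 150  # miles
speed = 45      # miles per hour

hours = distance / speed            # 3.33... hours
whole_hours = int(hours)            # the whole-hour part
minutes = round((hours - whole_hours) * 60)  # leftover fraction as minutes
print(whole_hours, "hours", minutes, "minutes")
```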
60 miles
30 miles. If she was driving at a rate of 40 miles each hour, and she drove for 3/4 of an hour (45 minutes), then she would have driven 3/4 of 40 miles, which is 30 miles.