Previously Written:: "WRONG! The average speed is the mean, or (1000 + 1) divided by 2: 1000 miles an hour."
Edited Answer::
You have to take into account not only the speeds but the distance traveled in the time allotted. The first mile at 1 MPH takes 3,600 seconds; the second mile at 1,000 MPH takes only 3.6 seconds. That is 2 miles in 3,603.6 seconds, or roughly one mile per 1,801.8 seconds, which translates to 30.03 minutes per mile.
2 miles in 60.06 minutes means you are traveling a little bit slower than 2 miles per hour.
The only way to get the mean of the two speeds, 500.5 MPH, as your average is to travel at 1,000 MPH for an hour and then at 1 MPH for another hour. You would have traveled 1,001 miles, which you could then divide by the 2 hours you drove: an average speed of 500.5 MPH.
It's a trick question, because you only travel one mile at 1,000 MPH, and that mile takes just 3.6 seconds. Hardly enough to bring the average speed above 2 miles per hour.
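The time-weighted arithmetic above can be checked with a short script (a minimal sketch; the two one-mile legs at 1 MPH and 1,000 MPH come from the answer above):

```python
# Average speed is total distance over total time, not the mean of the speeds.
legs = [(1.0, 1.0), (1.0, 1000.0)]  # (miles, mph) for each one-mile leg

total_miles = sum(d for d, _ in legs)
total_hours = sum(d / s for d, s in legs)  # time per leg = distance / speed

avg_mph = total_miles / total_hours
print(round(total_hours * 3600, 1))  # 3603.6 seconds for the whole trip
print(round(avg_mph, 3))             # about 1.998 MPH, nowhere near 500.5
```

Note how the slow leg dominates: the hour spent at 1 MPH swamps the 3.6 seconds spent at 1,000 MPH.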
Divide the distance by the speed.
60 mph
An average speed of 40 miles per hour.
That depends on the average speed travelled. At an average of 56 miles per hour it will take 1 hour. The distance needs to be divided by the average speed to obtain the answer.
If you travel 160 miles in 3 hours, your average speed is 53 and one-third miles per hour.
Your average speed is 44.3 miles per hour.
The average travel speed of a wolf is 5 miles per hour.
The average speed is 62.5 miles per hour (mph).
An average hamster can travel at a speed of 5 miles per hour.
The average speed is 0.0069 miles per second.
This depends on your average speed: 300 miles / average speed in mph = how many hours you will travel.
The number of hours it will take to travel 308 miles is (308) / (your average speed, in miles per hour).
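The two formula answers above share one computation, hours = distance / average speed. A minimal sketch (the `travel_hours` name and the 55 and 70 MPH speeds are illustrative assumptions, not from the answers):

```python
def travel_hours(distance_miles: float, avg_mph: float) -> float:
    """Time to cover a distance at a given average speed."""
    return distance_miles / avg_mph

print(travel_hours(308, 55))  # 5.6 hours at an average of 55 MPH
print(travel_hours(308, 70))  # 4.4 hours at an average of 70 MPH
```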
Depends on average speed.
It will depend on the average speed at which you travel.
180 mph
288/6 x 8 = 384. Therefore, the car will travel 384 miles, assuming the same average speed.
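The last answer's proportion can be reproduced directly (a minimal sketch using the 288-miles-in-6-hours figures from that answer):

```python
miles_driven, hours_driven = 288, 6
avg_mph = miles_driven / hours_driven  # 288 / 6 = 48 MPH average
print(avg_mph * 8)                     # 48 x 8 = 384.0 miles in 8 hours
```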