Best Answer

Previously Written:: "WRONG!

the average speed is the mean
..or (1000 + 1) divided by 2

1000 miles an hour"

Edited Answer::

You have to take into account not only the speeds but the distance traveled in the time allotted. You cover two miles in 3,603.6 seconds, or roughly one mile per 1,801.8 seconds, which works out to about 30.03 minutes per mile.

2 miles in 60.06 minutes means you averaged just a little slower than 2 miles per hour.

The only way to get the mean of the two speeds would be to travel at 1000 MPH for an hour, then travel another hour at 1 MPH. You would have covered 1001 miles, which you could then divide by the 2 hours you drove, giving an average speed of 500.5 MPH.

It's a trick question, because you only travel one mile at 1000 MPH, so that mile takes just 3.6 seconds. Hardly enough to pull the average speed above 2 miles per hour.
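The arithmetic above can be sketched in a few lines of Python. This is just an illustration of the total-distance-over-total-time rule; the function name and leg format are made up for this example:

```python
def average_speed(legs):
    """Average speed over a trip given as (distance_miles, speed_mph) legs."""
    total_distance = sum(d for d, _ in legs)          # miles
    total_time = sum(d / s for d, s in legs)          # hours
    return total_distance / total_time                # mph

# One mile at 1000 MPH, then one mile at 1 MPH:
avg = average_speed([(1, 1000), (1, 1)])
print(round(avg, 3))  # 1.998 -- just under 2 MPH, not 500.5
```

Note that this is the harmonic mean of the speeds (weighted by distance), which is why it lands near the slower speed rather than halfway between them.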

Wiki User · 14y ago
Q: If you travel 1 mile at 1000 miles per hour and then another mile at 1 mile per hour what is your average speed?