Divide the distance by the speed. The result will be in hours in this case.
Speed is a representation of distance travelled over time, so to find the time taken you divide the distance travelled by the speed. In this case: 1000 miles divided by 70 mph, i.e. 1000/70 ≈ 14.2857. Therefore, it would take roughly 14 hours and 17 minutes to travel 1000 miles at 70 mph.
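The division above can be checked with a short snippet (a minimal sketch; the variable names are just illustrative):

```python
# Time = distance / speed, giving a result in hours when the
# distance is in miles and the speed is in miles per hour.
distance_miles = 1000
speed_mph = 70
time_hours = distance_miles / speed_mph
print(round(time_hours, 4))  # 14.2857
```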
On average they can travel 1000 miles in one day.
2 million
You are travelling at 100 mph. That means (obviously) that every hour you travel 100 miles, and since 1000/100=10 you will be travelling for ten hours.
Previously written: "WRONG! The average speed is the mean, or (1000 + 1) divided by 2 miles an hour."

Edited answer: You have to take into account not only the speeds but the distance travelled in the time allotted. You cover two miles in 3603.6 seconds, or roughly one mile per 1801.8 seconds, which is 30.03 minutes. Two miles in 60.06 minutes means you travelled a little slower than 2 miles per hour.

The only way the average speed would be the mean of the two speeds is if you travelled at 1000 mph for an hour and then another hour at 1 mph. You would have covered 1001 miles, which you could then divide by the two hours you drove, for an average speed of 500.5 mph.

It's a trick question, because at 1000 mph you only travel one mile, covering it in just 3.6 seconds. That is hardly enough to raise the average speed much above 2 miles per hour.
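The arithmetic in that answer can be verified directly: average speed is total distance divided by total time, not the mean of the two speeds. This sketch assumes the scenario described above (one mile at 1000 mph, then one mile at 1 mph):

```python
# Time to cover each mile, in hours.
fast_time_h = 1 / 1000   # 1 mile at 1000 mph = 3.6 seconds
slow_time_h = 1 / 1      # 1 mile at 1 mph = 1 hour
total_time_h = fast_time_h + slow_time_h

# Average speed = total distance / total time.
avg_speed_mph = 2 / total_time_h
print(round(total_time_h * 3600, 1))  # 3603.6 seconds for two miles
print(round(avg_speed_mph, 3))        # 1.998 mph, just under 2 mph
```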
1000 minutes
The Sun is moving through the galaxy at about half a million miles per hour. Earth orbits the Sun at about an eighth of that speed, and the Milky Way galaxy itself is moving at about a million miles per hour. Earth's rotational speed (about 1000 miles per hour) is comparatively irrelevant. So Earth is moving somewhere between 0.5 and 1.5 million miles per hour, or roughly 0.07% to 0.22% of the speed of light.
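Those percentages follow from dividing the quoted speeds by the speed of light in mph (a rough sanity check; all the input values are approximate):

```python
# Speed of light is about 670,616,629 mph.
speed_of_light_mph = 670_616_629
low_mph, high_mph = 500_000, 1_500_000  # range quoted above

print(round(100 * low_mph / speed_of_light_mph, 2))   # ~0.07 percent
print(round(100 * high_mph / speed_of_light_mph, 2))  # ~0.22 percent
```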
"1000 speed an hour" is not a term used to express speed; please rephrase it in kilometres or miles per hour.
1000 miles an hour
Time = Distance/Speed = 1000 miles / 70 mph ≈ 14.286 hours
The orbital speed of the Earth around the Sun averages about 108,000 km/h. The planet travels a total of 940 million kilometers in one revolution.
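Those two figures are consistent with each other: distance divided by speed should come out near one year. A quick check, using the numbers quoted above:

```python
# One revolution: total distance / orbital speed.
distance_km = 940_000_000
speed_kmh = 108_000

hours = distance_km / speed_kmh
days = hours / 24
print(round(days, 1))  # ~362.7 days, close to one year
```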
Wrong! The Flash can actually go the speed of light, which is about 670,000,000 mph, not Mach 1000 as the old answer said. And just to be clear, Mach 1000 is about 670,000 mph.