The easiest way to solve this is to think of a number that both 5 and 10 go into. Say 10.
Suppose each half of the race was 10 miles long.
Then the first half, at 5 mph, takes 2 hours. The second half, at 10 mph, takes one hour. That makes 3 hours in all to run the 10 + 10 = 20 miles.
So average speed = 20/3 ≈ 6.67 mph.
The required answer is known as the harmonic mean of the two speeds.
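A quick sketch of that calculation, using the 5 mph and 10 mph figures from the problem above:

```python
def harmonic_mean(a, b):
    # The harmonic mean gives the correct average speed when
    # equal distances are covered at each of the two speeds.
    return 2 * a * b / (a + b)

# First half at 5 mph, second half at 10 mph
print(harmonic_mean(5, 10))  # 6.666..., i.e. about 6.67 mph
```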
An average deer can run up to 40 miles per hour. :)
Three miles in 45 minutes is an average speed of 4 miles per hour or 5.87 feet per second.
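That conversion can be checked directly (5,280 feet per mile, 3,600 seconds per hour):

```python
MILES_TO_FEET = 5280
SECONDS_PER_HOUR = 3600

def mph_to_fps(mph):
    # Convert miles per hour to feet per second.
    return mph * MILES_TO_FEET / SECONDS_PER_HOUR

speed_mph = 3 / (45 / 60)  # 3 miles in 45 minutes = 4 mph
print(round(mph_to_fps(speed_mph), 2))  # 5.87
```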
Shayla's speed for the last half-mile was approximately 0.002 miles per second.
No, the speed of light is 186,282.4 miles per second. The speed of sound at sea level is about 0.2114 miles per second.
3 miles per second
= 10,800 miles per hour
= 29,030,400 furlongs per fortnight
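Those conversions follow from 8 furlongs per mile and 14 × 24 = 336 hours per fortnight:

```python
def mps_to_mph(mi_per_sec):
    # Miles per second to miles per hour.
    return mi_per_sec * 3600

def mph_to_furlongs_per_fortnight(mph):
    # 8 furlongs per mile, 336 hours per fortnight.
    return mph * 8 * 336

mph = mps_to_mph(3)
print(mph)                                 # 10800
print(mph_to_furlongs_per_fortnight(mph))  # 29030400
```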
The average speed is 0.0069 miles per second.
100 miles / 50 seconds = 2 miles per second average speed
Planet Mars orbits the sun at an average velocity of 14.96 miles per second.
186,000 miles per second
You would multiply by 3,600.
To find Kira's average speed, you need to divide the total distance by the time taken: Speed = Distance/Time. In this case, Kira's average speed would be 72 miles divided by 36 seconds, which equals 2 miles per second.
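The formula Speed = Distance / Time, as applied above:

```python
def average_speed(distance_miles, time_seconds):
    # Average speed in miles per second.
    return distance_miles / time_seconds

print(average_speed(72, 36))  # 2.0 miles per second
```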
If the speed is needed in miles per second, it is 186,000 miles per second.
The average speed over the first 30 miles is equal to the average speed over the next 15 miles: 45 mph for each leg. You cannot guarantee that the speed was constant in between, though.
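A sketch of the per-leg check. The leg times (40 and 20 minutes) are assumed values chosen to be consistent with 45 mph on each leg; the original answer does not state them:

```python
def leg_speed(miles, minutes):
    # Average speed in miles per hour for one leg.
    return miles / (minutes / 60)

first = leg_speed(30, 40)   # assumed 40-minute first leg
second = leg_speed(15, 20)  # assumed 20-minute second leg
print(first, second)  # 45.0 45.0
```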