In 0.7 hours, Matt drives 35 miles. In 1 hour, Matt drives 35/0.7 = 50 miles. His average speed is 50 mph.
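Written out with the standard distance-rate relationship, the arithmetic is:

\[
\text{average speed} = \frac{\text{distance}}{\text{time}} = \frac{35\ \text{mi}}{0.7\ \text{h}} = 50\ \text{mph}.
\]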
53
Your average speed is 10 miles per hour.
If someone runs 10 miles in 90 minutes, their average speed is about 6.67 mph.
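Spelled out, 90 minutes is 1.5 hours, so:

\[
\frac{10\ \text{mi}}{1.5\ \text{h}} \approx 6.67\ \text{mph}.
\]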
Three miles in 45 minutes is an average speed of 4 miles per hour or 5.87 feet per second.
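The conversion behind both figures, using 1 mile = 5280 feet and 1 hour = 3600 seconds:

\[
\frac{3\ \text{mi}}{0.75\ \text{h}} = 4\ \text{mph} = \frac{4 \times 5280\ \text{ft}}{3600\ \text{s}} \approx 5.87\ \text{ft/s}.
\]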
40 miles.
If you run 2.5 miles in 20 minutes, your average speed is 7.5 miles per hour.
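Since 20 minutes is one third of an hour:

\[
\frac{2.5\ \text{mi}}{1/3\ \text{h}} = 7.5\ \text{mph}.
\]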
Your average speed is 44.3 miles per hour.
Your average speed is 11.25 miles per hour.
P and Q are 12 miles apart. Patrick drives from P to Q at an average speed of 40 mph and back from Q to P at an average speed of 45 mph, and the return trip takes two minutes less. Let "s" be the distance between P and Q in miles and let "t" be the travel time in minutes; there are 60 minutes in 1 hour. By the definition of average speed, time = distance/speed, so the time in minutes is 60s divided by the speed in miles per hour.
Case 1: at 40 mph, t1 = 60s/40 = 3s/2 minutes. (1)
Case 2: at 45 mph, t2 = 60s/45 = 4s/3 minutes. (2)
Since Patrick takes two minutes less on the faster trip, t1 - t2 = 2. Subtracting (2) from (1) gives 3s/2 - 4s/3 = s/6 = 2, so s = 12. So we can conclude that P and Q are 12 miles apart.
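As a quick check of s = 12, the two one-way times do differ by exactly two minutes:

\[
\frac{12}{40}\ \text{h} = 18\ \text{min}, \qquad \frac{12}{45}\ \text{h} = 16\ \text{min}, \qquad 18 - 16 = 2\ \text{min}.
\]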
This is an average speed of 22.06 miles per hour.
Her average speed was 58 miles per hour.