Easy: divide 520 by 8.
520/8 = 65 miles per hour.
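The division above is just speed = distance / time. A minimal Python sketch (the helper name is illustrative, not from the original answer):

```python
def average_speed(distance_miles, time_hours):
    """Return average speed in miles per hour: distance / time."""
    return distance_miles / time_hours

print(average_speed(520, 8))  # 65.0 mph
```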
470 mph
750 miles an hour, mate.
The average speed if an airplane travels 1364 miles in 5.5 hours is 248 miles/hr.
If you average 4 miles per hour and manage to keep going for the whole 24 hours without stopping, you would walk 96 miles. Take the bus, it's quicker.
3 hours and 30 minutes = 3*60 + 30 = 210 minutes. The plane is therefore covering 1000/210 ≈ 4.762 miles per minute, which means it is travelling at 4.762*60 ≈ 285.7 miles per hour.
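The minute-based working above can be sketched in Python (variable names are illustrative):

```python
hours, minutes = 3, 30
total_minutes = hours * 60 + minutes      # 210 minutes
miles_per_minute = 1000 / total_minutes   # ~4.762 miles per minute
miles_per_hour = miles_per_minute * 60    # convert back to mph

print(round(miles_per_hour, 2))  # 285.71
```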
3.5 hours.
35/5 = 7 hours
The average speed for the 3,000 miles is 66.7 miles per hour.
About sixty-two miles an hour, on average.
25/5 = 5 miles per hour
The average is 40 mph. If you go up a hill that is 60 miles long at 30 mph, it will take you two hours; if you travel back down those 60 miles at 60 mph, it will take you one hour. Divide 120 miles (60 miles going up, 60 miles coming down) by three hours (two hours going up, one hour coming down) and you get an average of 40 mph.
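The point of the answer above is that average speed is total distance over total time, not the mean of the two speeds. A short sketch (the leg values are the ones from the answer):

```python
up_distance, up_time = 60, 2      # 60 miles uphill at 30 mph takes 2 hours
down_distance, down_time = 60, 1  # 60 miles downhill at 60 mph takes 1 hour

total_distance = up_distance + down_distance  # 120 miles
total_time = up_time + down_time              # 3 hours
average = total_distance / total_time

print(average)  # 40.0, not the naive (30 + 60) / 2 = 45
```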
It would depend on how fast you're going--if you're going an average of 50 mph, then it would take approximately 26 hours for 1300 miles.
542 miles in 12 hours and 37 minutes equates to an average speed of 42.96 miles per hour.
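The answer above converts 12 hours 37 minutes to decimal hours before dividing; a quick Python check of that arithmetic:

```python
hours = 12 + 37 / 60   # 12 h 37 min as decimal hours (~12.617)
speed = 542 / hours    # miles divided by hours

print(round(speed, 2))  # 42.96 mph
```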
11 hours, 51 minutes
about 10 mph
Going about 55 mph (average), it is 5 hours 15 minutes.