6 mph, six miles per hour.
speed = distance/time = 2.8 mi / 30 min ≈ 0.093 mi/min, which is about 5.6 mph.
To find out how long it takes a cat to run 1.5 miles at a speed of 30 miles per hour, you can use the formula: time = distance ÷ speed. Plugging in the values, time = 1.5 miles ÷ 30 miles per hour, which equals 0.05 hours. Converting this to minutes, 0.05 hours × 60 minutes/hour = 3 minutes. So, it would take a cat 3 minutes to run 1.5 miles.
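A quick way to check this kind of calculation is to compute time = distance ÷ speed directly and then convert hours to minutes. The snippet below is a minimal sketch in Python using the 1.5-mile / 30 mph figures from the answer above; the variable names are just illustrative.

```python
# Minimal sketch: time = distance / speed, then convert hours to minutes.
distance_miles = 1.5
speed_mph = 30.0

time_hours = distance_miles / speed_mph   # 0.05 hours
time_minutes = time_hours * 60            # 3 minutes

print(f"{time_hours} hours = {time_minutes} minutes")
```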
1.5 miles in 30 minutes is equivalent to 3 miles in 60 minutes = 1 hour, so the average speed is 3 mph.
To find your average speed, divide the total distance by the total time. You ran 30 miles in 90 minutes, which is 1.5 hours (since 90 minutes is 1.5 hours). Therefore, your average speed is 30 miles ÷ 1.5 hours = 20 miles per hour.
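The same division works for any distance/time pair as long as the time is first converted to hours. Here is a minimal sketch; the helper function name is illustrative, and the figures are taken from the two answers above (1.5 miles in 30 minutes, and 30 miles in 90 minutes).

```python
# Minimal sketch: average speed = distance / time, with minutes converted to hours.
def average_speed_mph(distance_miles: float, time_minutes: float) -> float:
    time_hours = time_minutes / 60
    return distance_miles / time_hours

print(average_speed_mph(1.5, 30))   # 3.0 mph
print(average_speed_mph(30, 90))    # 20.0 mph
```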
60 mph (60 miles per hour)
It is: 4.8 mph
3.8 mph
100 miles per hour
3 hours and 30 minutes = 3 × 60 + 30 = 210 minutes. The plane is therefore covering 1000 ÷ 210 ≈ 4.76 miles per minute, which means it is travelling at about 4.76 × 60 ≈ 285.7 miles per hour.
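The same arithmetic can be done by converting the whole time to hours up front, which avoids the miles-per-minute intermediate step. A minimal sketch, assuming the 1,000-mile / 3 h 30 min figures from the answer above:

```python
# Minimal sketch: convert 3 hours 30 minutes to hours, then divide distance by time.
distance_miles = 1000
time_hours = 3 + 30 / 60        # 3.5 hours

speed_mph = distance_miles / time_hours
print(round(speed_mph, 1))      # 285.7 mph, matching the per-minute calculation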
To drive 12 miles at a constant speed of 30 miles per hour takes 24 minutes. At 60 mph you cover a mile a minute; 30 mph is half as fast, so each mile takes twice as long, 2 minutes per mile, and 12 miles × 2 minutes per mile = 24 minutes.
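Put another way, travel time in minutes is distance divided by speed, times 60. A minimal sketch using the 12-mile / 30 mph figures above; the function name is illustrative.

```python
# Minimal sketch: minutes needed = (distance / speed) * 60.
def minutes_to_drive(distance_miles: float, speed_mph: float) -> float:
    return distance_miles / speed_mph * 60

print(minutes_to_drive(12, 30))   # 24.0 minutes, i.e. 2 minutes per mile at 30 mph
```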
20 mph. At that speed, in one hour you would travel 20 miles. After half an hour of traveling at that speed, you would have gone 10 miles.
The distance traveled in 30 minutes of driving depends on the speed of the car. If you are traveling at 60 miles per hour, you would cover 30 miles in 30 minutes.
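Rearranging the same formula gives distance = speed × time. A minimal sketch, assuming 30 minutes of driving at a few example speeds drawn from the two answers above; the function name is illustrative.

```python
# Minimal sketch: distance = speed * time, with time given in minutes.
def distance_miles(speed_mph: float, time_minutes: float) -> float:
    return speed_mph * (time_minutes / 60)

for speed in (20, 30, 60):
    print(f"{speed} mph for 30 min -> {distance_miles(speed, 30)} miles")
# 20 mph -> 10.0 miles, 30 mph -> 15.0 miles, 60 mph -> 30.0 miles
```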
It travels 288 miles in 4 hours 30 minutes at that speed (64 mph × 4.5 hours = 288 miles).