6 mph, six miles per hour.
speed = distance/time = 2.8 mi / 30 min ≈ 0.093 mi/min, which is about 5.6 mph.
1.5 miles in 30 minutes is equivalent to 3 miles in 60 minutes = 1 hour, so the average speed is 3 mph.
To find your average speed, divide the total distance by the total time. You ran 30 miles in 90 minutes, which is 1.5 hours, so your average speed is 30 miles ÷ 1.5 hours = 20 miles per hour.
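The distance-over-time arithmetic above can be sketched in Python (the function name is illustrative):

```python
def average_speed_mph(distance_miles, time_minutes):
    """Average speed in mph: total distance divided by total time in hours."""
    hours = time_minutes / 60
    return distance_miles / hours

print(average_speed_mph(30, 90))  # 20.0
```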
60 mph (60 miles per hour)
To determine how many miles Sarah can jog in 30 minutes, we need to know her jogging speed. For example, if she jogs at a pace of 6 miles per hour, she would cover 3 miles in 30 minutes. Without her specific speed, it's impossible to give an accurate answer.
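The reverse calculation, distance from a known speed and time, works the same way in a short sketch (function name is illustrative):

```python
def distance_covered_miles(speed_mph, time_minutes):
    """Distance in miles: speed multiplied by time converted to hours."""
    return speed_mph * (time_minutes / 60)

print(distance_covered_miles(6, 30))  # 3.0
```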
It is: 4.8 mph
3.8 mph
100 miles per hour
3 hours and 30 minutes = 3 × 60 + 30 = 210 minutes. The plane therefore covers 1000/210 ≈ 4.76 miles per minute, which means it is flying at about 4.76 × 60 ≈ 285.7 miles per hour.
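The two-step conversion above (miles per minute, then miles per hour) can be checked with a few lines of Python:

```python
distance = 1000              # miles
minutes = 3 * 60 + 30        # 3 h 30 min = 210 minutes
per_minute = distance / minutes
mph = per_minute * 60

print(round(per_minute, 2))  # 4.76
print(round(mph, 1))         # 285.7
```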
To drive 12 miles at a constant speed of 30 miles per hour takes 24 minutes. At 60 mph you would cover a mile per minute; 30 mph is half that speed, so each mile takes twice as long, 2 minutes per mile, and 12 miles take 24 minutes.
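The travel-time calculation above, time = distance ÷ speed converted to minutes, can be sketched as (function name is illustrative):

```python
def drive_time_minutes(distance_miles, speed_mph):
    """Travel time in minutes: distance over speed (hours), times 60."""
    return distance_miles / speed_mph * 60

print(drive_time_minutes(12, 30))  # 24.0
```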
20 mph. At that speed, in one hour you would travel 20 miles. After half an hour of traveling at that speed, you would have gone 10 miles.
The distance traveled in 30 minutes of driving depends on the speed of the car. If you are traveling at 60 miles per hour, you would cover 30 miles in 30 minutes.
It travels 288 miles in 4 hours 30 minutes at that speed.