Q: What is the average speed if you run 1.15 miles in 12 minutes?

Best Answer

1 hour = 60 minutes

speed = distance ÷ time

= 1.15 ÷ 12 miles per minute

= (1.15 ÷ 12) × 60 miles per hour

= 5.75 mph
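The arithmetic above can be sketched in a few lines of Python (a minimal illustration; the helper name `average_speed_mph` is mine, not from the original answer):

```python
def average_speed_mph(miles, minutes):
    """Average speed in miles per hour: (miles / minutes) * 60."""
    return miles / minutes * 60

# Rounding guards against binary floating-point noise in 1.15.
print(round(average_speed_mph(1.15, 12), 2))  # 5.75
```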


Continue Learning about Math & Arithmetic

If you traveled at a constant rate of 75 miles per hour for 115 miles, it would take you 1.533 hours or 1 hour and 32 minutes. 1.533 hours is equal to 5520 seconds.
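The time = distance ÷ speed relationship used here can be checked with a short Python sketch (the function name is hypothetical):

```python
def travel_time(distance_miles, speed_mph):
    """Split a constant-speed trip's duration into whole hours,
    remaining minutes, and total seconds."""
    hours = distance_miles / speed_mph
    whole_hours = int(hours)
    minutes = round((hours - whole_hours) * 60)
    total_seconds = round(hours * 3600)
    return whole_hours, minutes, total_seconds

# 115 miles at 75 mph: 1.533... hours = 1 h 32 min = 5520 s
print(travel_time(115, 75))  # (1, 32, 5520)
```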

If an amusement park has a top speed of 607,200 feet per hour, then its speed in miles per hour is 115.
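The conversion relies on there being 5,280 feet in a mile; a quick Python check (helper name is my own):

```python
FEET_PER_MILE = 5280

def feet_per_hour_to_mph(feet_per_hour):
    """Convert a speed in feet per hour to miles per hour."""
    return feet_per_hour / FEET_PER_MILE

print(feet_per_hour_to_mph(607_200))  # 115.0
```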

Divide 115 seconds by 60: 115 ÷ 60 ≈ 1.92 minutes, or almost two minutes.

An amusement park with a top speed of 607,200 feet per hour can be said to have a speed of 115 miles per hour.

Time = Distance/Speed = 120/115 = 1.04 hours, approx.

Related questions

230 mph!

If you have to travel 115 miles at a speed of 60 miles per hour, you cover 60 miles each hour, so the time needed is 115 ÷ 60 ≈ 1.92 hours, i.e. about 1 hour and 55 minutes to cover 115 miles.

2 hours 52.5 minutes.

Time = Distance/Average speed = 115/100 = 1.15 hours or 1 hour and 9 minutes.

11,000 miles = 58,080,000 feet
115 days = 165,600 minutes
95.65 miles per day = 350.72 feet per minute
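The chain of conversions above can be verified in Python (variable names are mine):

```python
FEET_PER_MILE = 5280
MINUTES_PER_DAY = 24 * 60  # 1440

miles = 11_000
days = 115

feet = miles * FEET_PER_MILE      # 58,080,000 feet
minutes = days * MINUTES_PER_DAY  # 165,600 minutes
miles_per_day = miles / days      # ≈ 95.65 miles per day
feet_per_minute = feet / minutes  # ≈ 350.72 feet per minute
```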

One hour and 55 minutes. At 60 miles per hour you drive one mile per minute, so in this case, 115 minutes.

About 1 hour and 55 minutes.

About 45 miles.

1 hour 29.5 minutes.

Usain Bolt ran 100 m (with a running start) at a speed of 41 km/h.

1 hour 32 minutes.