I work that out as 4.87 seconds.
4,000 feet
Impossible to answer! Feet are a measurement of distance; seconds are a measurement of time (or angles).
40 feet in 0.5 seconds = 80 feet in 1 second, i.e. 80 fps
"Mean" means "average"= (distance covered) divided by (time to cover the distance)= ( 90 / 4.5 ) = 20 feet per second
5 miles/hour × (5,280 feet / 1 mile) × (1 hour / 3,600 seconds) ≈ 7.33 feet per second
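The conversion above (multiply by 5,280 feet per mile, divide by 3,600 seconds per hour) can be sketched as a small Python helper. The function name is mine, not from the thread:

```python
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

def mph_to_fps(mph):
    """Convert a speed in miles per hour to feet per second."""
    return mph * FEET_PER_MILE / SECONDS_PER_HOUR

print(mph_to_fps(5))  # about 7.33 feet per second
```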
You cannot; feet are a measure of DISTANCE while seconds are a measure of TIME.
It is a lot easier to count seconds than to estimate distance.
At 30 mph, an object travels 44 feet per second, so about 88 feet in two seconds.
distance travelled (feet) / time taken (seconds)
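The formula above, mean speed = distance travelled / time taken, can be written as a one-line Python helper (the function name is my own, for illustration):

```python
def mean_speed_fps(distance_feet, time_seconds):
    """Mean speed in feet per second = distance covered / time to cover it."""
    return distance_feet / time_seconds

print(mean_speed_fps(90, 4.5))  # 20.0, matching the 90 ft in 4.5 s answer
print(mean_speed_fps(40, 0.5))  # 80.0, matching the 40 ft in 0.5 s answer
```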
1,531.2 feet in 12 seconds at 87 mph
"Mean" means "average"= (distance covered) divided by (time to cover the distance)= ( 90 / 4.5 ) = 20 feet per second
87 miles per hour = 1,531.2 feet per 12 seconds
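Combining the two steps (convert mph to feet per second, then multiply by the elapsed time) reproduces the 87 mph figure. A minimal sketch, with names of my own choosing:

```python
def distance_feet(mph, seconds):
    """Distance in feet covered at a given mph over a given number of seconds."""
    feet_per_second = mph * 5280 / 3600
    return feet_per_second * seconds

print(distance_feet(87, 12))  # about 1,531.2 feet, as in the answers above
```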