Average speed is usually a unit rate. So, if you travel 350 miles in 1 hour, that is read as "350 miles per hour."
The average speed would be 2.2 miles per hour.
5530 / 790 = 7 hours
How much time would it take for the sound of thunder to travel 1,500 meters if sound travels at a speed of 330 m/sec? Time = distance / speed = 1,500 m / (330 m/sec) ≈ 4.5 seconds.
In space.
The average speed would be 30 km/h. Speed is measured in length per time (for example, km/h), so to calculate the average speed you divide 75 km by 2.5 hours: 75 km / 2.5 hours = 30 km/h.
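A minimal Python sketch of that division (values taken from this answer; variable names are illustrative):

```python
# Average speed = distance / time, keeping the units explicit.
distance_km = 75.0
time_hours = 2.5
print(distance_km / time_hours, "km/h")  # 30.0 km/h
```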
Miles traveled divided by the time it took to travel them. For example, if you travel 60 miles in 60 minutes, your average speed is 1 mile per minute, which is 60 mph.
The vehicle would be traveling at 125 mph.
No. The average speed would be the distance traveled divided by the elapsed time.
7 x 65 = 455 miles.
Average speed = (distance traveled) / (total time), so total time = (distance traveled) / (average speed). To calculate the time taken, you need to know the speed and the distance.

Note that average speed is normally calculated from the distance traveled and the time taken. If you slow down and speed up, those variations affect your average speed over a given distance. But if you were traveling at a constant speed (say you're on a flat, straight road with the cruise control set at 50 miles per hour), then your average speed would be that constant speed.

So, in the example, if you were going 50 MPH: total time = (distance traveled) / (average speed) = 0.34 miles / (50 miles/hour) = 0.0068 hr = 0.408 minutes = 24.48 seconds. (Note this is only for the example speed of 50 MPH; plug in your own speed to obtain the result you are looking for.)
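A short Python sketch of the same calculation, including the hours-to-seconds conversion (numbers from the example above; substitute your own speed):

```python
# Solve for time: time = distance / speed.
distance_miles = 0.34
speed_mph = 50.0                           # example cruise speed; plug in your own

time_hours = distance_miles / speed_mph    # 0.0068 hr
time_minutes = time_hours * 60             # 0.408 min
time_seconds = time_minutes * 60           # 24.48 s
print(f"{time_hours} hr = {time_minutes} min = {time_seconds} s")
```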
To find average speed, divide the distance traveled by the time it took to travel that distance. For example: what is the average speed of a runner who traveled 10 miles in 2 hours? 10 (distance) / 2 (time) = 5 (speed), so the answer is 5 miles per hour. They may not have been traveling 5 miles per hour the whole time, but the average speed they were traveling is 5 miles per hour. You can also find the distance by multiplying the speed by the time, and find the time by dividing the distance by the speed. (Don't forget that units will vary for the answers.) Hope this helped :)
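A minimal Python sketch of the three rearrangements described above (function names are illustrative, not from any particular library):

```python
def average_speed(distance, time):
    # speed = distance / time
    return distance / time

def distance_traveled(speed, time):
    # distance = speed * time
    return speed * time

def time_taken(distance, speed):
    # time = distance / speed
    return distance / speed

# The runner example: 10 miles in 2 hours.
print(average_speed(10, 2))     # 5.0 (miles per hour)
print(distance_traveled(5, 2))  # 10 (miles)
print(time_taken(10, 5))        # 2.0 (hours)
```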
Previously written: "WRONG! The average speed is the mean, or (1000 + 1) divided by 2 = 500.5 miles an hour."

Edited answer: You have to take into account not only the speed but also the distance traveled in the time allotted. You cover two miles in 3,603.6 seconds, or roughly one mile per 1,801.8 seconds, which translates to 30.03 minutes per mile. Two miles in 60.06 minutes means you were traveling a little slower than 2 miles per hour. The only way to get 500.5 MPH as the average speed (the mean) is if you traveled at 1000 MPH for an hour, then traveled another hour at 1 MPH. You would have traveled 1001 miles, which you could then divide by the number of hours you drove; your average speed would be 500.5 MPH. It's a trick question, because you only travel one mile at 1000 MPH, so that mile takes just 3.6 seconds. Hardly enough for an average speed of anything more than 2 miles per hour.
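A quick Python check of the arithmetic above, assuming the question is "one mile at 1000 mph, then one mile at 1 mph":

```python
# One mile at 1000 mph, then one mile at 1 mph.
fast_hours = 1.0 / 1000.0                 # 0.001 hr  (3.6 seconds)
slow_hours = 1.0 / 1.0                    # 1 hr      (3600 seconds)

total_miles = 2.0
total_hours = fast_hours + slow_hours     # 1.001 hr  (3603.6 seconds)

true_average = total_miles / total_hours  # ~1.998 mph
naive_mean = (1000.0 + 1.0) / 2.0         # 500.5 mph -- wrong for equal distances
print(true_average, naive_mean)
```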
70 miles per day.
To travel 145 miles in 3.5 hours requires an average speed of 145/3.5 mph. The distance traveled at this speed for 5 hours would be approximately 207 miles (5 x 145/3.5 miles).