Depends on the height of the aircraft above ground.
The altitude of an aircraft is measured above the ground, not above the horizon, and it's a distance. The altitude of the sun is not measured above the ground, and it's not a distance. If it were, it would always be some number near 93 million miles. The altitude of the sun is the angle that an observer sees between his horizon and the sun, and it's different for different observers in different places.
A very rough or "ball park" figure is about 20 miles. Comment: It depends on your height above the water. You could only see 20 miles if you were well above sea level. If you are standing more or less at sea level, with your eyes just a few feet above the water level, you could see only about 3 or 4 miles. For example:
Eye level at 6 feet: horizon at about 3 miles.
Eye level at 24 feet: horizon at about 6 miles.
Eye level at 96 feet: horizon at about 12 miles.
Eye level at 270 feet: horizon at about 20 miles.
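The figures above follow the common rule of thumb that the horizon distance in statute miles is about the square root of 1.5 times the eye height in feet. A short Python sketch reproduces the table (the function name is my own; refraction is ignored):

```python
import math

def horizon_miles(eye_height_ft: float) -> float:
    """Approximate distance to the sea horizon in statute miles.

    Uses the rule of thumb d ~ sqrt(1.5 * h), where h is the
    observer's eye height in feet. Derived from the Pythagorean
    relation with Earth's radius; atmospheric refraction is ignored.
    """
    return math.sqrt(1.5 * eye_height_ft)

for h in (6, 24, 96, 270):
    print(f"Eye level at {h} ft: horizon at about {horizon_miles(h):.0f} miles")
```

Running it gives 3, 6, 12, and 20 miles, matching the examples above.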
It depends on the mass of the black hole: the size of the event horizon is directly proportional to the mass. Most black holes are what we call "stellar mass" black holes, which range from about 3 to 30 times the mass of the sun. The event horizon of a 30-solar-mass black hole would be about 110 miles in diameter. Earth, by comparison, is just over 7,900 miles in diameter; an intermediate-mass black hole of roughly 2,150 times the mass of the sun would have an event horizon about the same size as Earth. Astronomers have detected supermassive black holes up to 12 billion times the mass of the sun. Such a black hole would have an event horizon about 44 billion miles across, or roughly six times the diameter of Pluto's orbit.
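These figures follow from the Schwarzschild radius, r_s = 2GM/c², which is indeed proportional to mass. A quick check in Python, with the physical constants supplied by me:

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
METERS_PER_MILE = 1609.344

def event_horizon_diameter_miles(solar_masses: float) -> float:
    """Schwarzschild diameter d = 2 * r_s = 4GM/c^2, converted to miles."""
    r_s = 2 * G * (solar_masses * M_SUN) / C**2
    return 2 * r_s / METERS_PER_MILE

print(event_horizon_diameter_miles(30))    # ~110 miles
print(event_horizon_diameter_miles(12e9))  # ~4.4e10 (44 billion) miles
```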
Ordinarily, the farthest visible big wave might be half a kilometer to a kilometer away. On a calm day, someone with eyes two meters above the water could see out to the horizon, where the line of sight is tangent to the Earth's surface and at a right angle to a radius. This gives a right triangle with one leg of 6371 km (Earth's radius), the horizon distance as the other leg, and a hypotenuse 0.002 km longer than the radius. By Pythagoras, horizon² + 6371² = 6371.002², so horizon ≈ 5.048 km. There is a far simpler approximation that gives the answer directly in statute miles: take the square root of 1.5 times the observer's eye height above sea level in feet. Using the figures above, 2 meters is about 6.5 feet; 1.5 × 6.5 = 9.75, and √9.75 ≈ 3.12 miles, compared with 5.048 km ≈ 3.14 miles from the exact calculation. Given that we are measuring from the height of the eyes rather than the top of the head, this approximation suffices and is far simpler and easier to remember.
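The exact Pythagorean calculation and the square-root shortcut can be compared directly; this sketch (function names mine) shows they agree to within a few hundredths of a mile for a 2 m eye height:

```python
import math

EARTH_RADIUS_KM = 6371.0
KM_PER_MILE = 1.609344
FEET_PER_METER = 1 / 0.3048

def horizon_exact_km(eye_height_m: float) -> float:
    """Exact Pythagorean horizon distance: sqrt((R + h)^2 - R^2)."""
    r = EARTH_RADIUS_KM
    h = eye_height_m / 1000.0
    return math.sqrt((r + h) ** 2 - r ** 2)

def horizon_approx_miles(eye_height_ft: float) -> float:
    """Rule-of-thumb approximation: sqrt(1.5 * height in feet) miles."""
    return math.sqrt(1.5 * eye_height_ft)

exact_miles = horizon_exact_km(2.0) / KM_PER_MILE
approx_miles = horizon_approx_miles(2.0 * FEET_PER_METER)
print(exact_miles, approx_miles)  # both ~3.14 miles
```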
The average pace for a human on level ground is about 4 miles per hour, so covering 5 miles at that pace takes one hour and fifteen minutes.
No, not with the naked eye. That kind of feat requires high-grade optics (with the possible exception of an aircraft leaving a conspicuous contrail).
When the aircraft is on the ground, it is about 3 metres. When it is in the air, it can be up to 5 nautical miles horizontally and 3,000 feet vertically.
A rough rule of thumb, assuming a jet cruising at about 600 miles per hour: multiply the number of minutes that the aircraft is visible to you by 10, which gives you the distance travelled in miles (600 mph is 10 miles per minute).
Miles Aircraft was created in 193#.
If you were on the water with an unobstructed view, you would be able to see 2.692 miles or 2.338 nautical miles.
The sight line to the point where the aircraft touches the horizon is a tangent to the surface of the earth. This forms one leg of a right-angled triangle, with the radius of the earth as the other leg and the radius of the earth plus the height of the plane above the surface as the hypotenuse. Two assumptions have to be made about the observation of the aircraft, both of which make small adjustments to the real distance, but with the accuracy of the figures used, they are insignificant:
1) The earth is spherical; strictly it isn't, it's geoid shaped - a sphere flattened at the poles and bulging around the equator.
2) The observer is looking along the surface of the earth (most likely out to sea) at mean sea level, not some 5-6 ft above it.
A third assumption is that the aircraft is flying parallel to the curvature of the earth at a steady altitude of 36,000 ft above mean sea level (known as horizontal flight).
The radius of the earth is about 3959 miles, and 36,000 ft = 36,000 ÷ 5280 miles ≈ 6.8 miles. The distance to the aircraft can then be calculated using Pythagoras:
distance_to_aircraft ≈ √(3965.8² - 3959²) ≈ 232 miles
Along the surface of the earth (to the point directly under the aircraft) it is slightly less, but still approximately 232 miles; it can be calculated as 3959 × arccos(3959/3965.8), with the inverse cosine evaluated in radians, not degrees.
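Both distances described above, the straight sight line and the arc along the surface, can be computed with a short Python sketch (function names are mine, figures as in the answer):

```python
import math

EARTH_RADIUS_MILES = 3959.0
FEET_PER_MILE = 5280.0

def slant_range_miles(altitude_ft: float) -> float:
    """Line-of-sight distance to an aircraft just on the horizon,
    seen from sea level: sqrt((R + h)^2 - R^2)."""
    h = altitude_ft / FEET_PER_MILE
    r = EARTH_RADIUS_MILES
    return math.sqrt((r + h) ** 2 - r ** 2)

def ground_range_miles(altitude_ft: float) -> float:
    """Great-circle distance to the point directly under the aircraft:
    R * arccos(R / (R + h)), with the angle in radians."""
    h = altitude_ft / FEET_PER_MILE
    r = EARTH_RADIUS_MILES
    return r * math.acos(r / (r + h))

print(slant_range_miles(36000))   # ~232 miles
print(ground_range_miles(36000))  # slightly less, still ~232 miles
```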
It depends on the type of aircraft. But just for reference, the largest commercial aircraft in the world currently, the Airbus A380, has a range of about 8,300 nautical miles (15,400 km).
At 100 ft, the horizon is approximately 12 miles away.
If an aircraft travels 120 miles in 12 minutes, how fast is it going? Twelve minutes is 1/5 of an hour, so in one hour it would travel 5 × 120 = 600 miles. The aircraft is traveling 600 miles per hour.
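The same speed calculation, as a one-liner in Python:

```python
distance_miles = 120
time_minutes = 12

# 12 minutes is 12/60 = 0.2 hours, i.e. one fifth of an hour
speed_mph = distance_miles / (time_minutes / 60)
print(speed_mph)  # 600.0
```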
41.15 miles