It depends on the person's height. Or, more precisely, on the distance between the ground and the center of the eyeball. But assuming the earth were a perfect sphere:

Average radius of earth = 6,372,797 meters.

Draw a circle representing the earth. Then draw a stick figure standing on the earth; it doesn't have to be to scale. Draw a line from the stick figure's head to form a tangent to the circle. Call this "line A". This line is the "line of sight", and the point at which it touches the circle is the farthest point the person can see.

Now draw a line from the center of the circle to the point where the line of sight touches the circle. Call this "line B". It forms a right angle (90 degrees) with line A. Now draw a third line from the center of the circle to the head of the stick figure. Call this "line C". Lines A, B, and C now form a right triangle, because of the right angle at the intersection of A and B. Line C, because it is opposite the right angle, is called the hypotenuse.

Now, you know the length of line B: the radius of the earth, 6,372,797 meters. And once you make an assumption about the height of the stick figure (say, 2 meters), you also know the length of line C: 6,372,799 meters. Using trigonometry, you can figure out the angle between lines B and C. The cosine of an angle is the length of the adjacent side (the side between the angle in question and the right angle) divided by the length of the hypotenuse. Here the adjacent side is line B and the hypotenuse is line C, so the cosine is 6,372,797 / 6,372,799, or 0.9999996. From a trigonometric table (or a scientific calculator), you can discover that the angle that corresponds to this cosine value is approximately 0.0454 degrees.

Since there are 360 degrees in a circle, this angle is about 1/7930 of the circumference of the circle. And the circle in question, the earth, has an average circumference of 40,041,470 meters.
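The construction above can be checked in a few lines of code. This is a minimal sketch using the same figures (radius 6,372,797 m, eye height 2 m):

```python
import math

R = 6_372_797.0   # mean Earth radius in metres, as given above
h = 2.0           # assumed eye height in metres

# Angle between lines B and C: cosine = adjacent / hypotenuse = R / (R + h)
theta = math.acos(R / (R + h))   # radians

# That angle's fraction of the full circle, times the circumference,
# gives the arc length along the surface to the horizon.
circumference = 2 * math.pi * R
arc = (theta / (2 * math.pi)) * circumference   # equivalently, theta * R

print(round(math.degrees(theta), 4))  # about 0.0454 degrees
print(round(arc))                     # about 5049 metres
```

Note that the "divide 360 by the angle, then divide the circumference by that" procedure collapses to simply multiplying the angle in radians by the radius.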
Dividing by 7930 results in 5,048.88 meters, or a little over 5 kilometers.

Remember, however, that this is based on an assumed height of 2 meters, which is very tall for a human. A shorter person would not be able to see as far; a taller person could see farther. But there are diminishing returns: a 4-meter-tall person can't see twice as far as a 2-meter-tall person, while a 1-meter-tall person can see more than half as far as a 2-meter-tall person.

The foregoing explanation, using the Pythagorean theorem to find the unknown side of a triangle (A² + B² = C²), works fine, but is rather cumbersome when two of the sides are measured from the center of the earth. A simpler formula is to multiply 1.17 by the square root of the height of the observation point (the eyeball). So, if a person's eye is 6 feet above the surface of the earth, multiply the square root of 6 (2.45) by 1.17 to find that the horizon is about 2.87 nautical miles away. (For statute miles, multiply your answer by 1.15.)

This "simpler" formula simply doesn't work. First of all, it's an approximation at best, and the further you get from a median height, the more erroneous it gets. Second, it only works with English units; it won't work with metric. For that matter, it doesn't work with anything other than feet. Well, you could make it work with other units, if someone did the math and figured out what the constant would have to be in each case. But "the math" in this case is the "cumbersome" method that I outlined in the first answer to this question. And, by the way, my method has nothing to do with the Pythagorean theorem. But now that you mention it, the Pythagorean theorem is much easier (and only slightly less precise) than my method. It is also easier (and much more precise) than your method. And, of course, it works with metric units. AND you don't have to remember any constants, just the formula: A-squared plus B-squared equals C-squared.
Where A is the radius of the earth, C is the radius of the earth plus the height of the observer, and you solve for B (the distance to the horizon). If you can't handle the big numbers, buy a calculator.
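The two approaches being argued over here are easy to compare side by side. This sketch assumes a mean Earth radius of about 20,902,231 feet (~6,371 km) and 6,076.12 feet per nautical mile; both constants are assumptions, not figures from the answers above:

```python
import math

R_FT = 20_902_231.0   # assumed mean Earth radius in feet (~6371 km)
FT_PER_NM = 6076.12   # feet per nautical mile

def horizon_pythagoras_nm(h_ft):
    """B = sqrt(C**2 - A**2), with A = R and C = R + h, in nautical miles."""
    return math.sqrt((R_FT + h_ft) ** 2 - R_FT ** 2) / FT_PER_NM

def horizon_rule_of_thumb_nm(h_ft):
    """The 1.17 * sqrt(height in feet) shortcut quoted above."""
    return 1.17 * math.sqrt(h_ft)

for h in (6, 100, 1000):
    print(h, round(horizon_pythagoras_nm(h), 2), round(horizon_rule_of_thumb_nm(h), 2))
```

For a 6-foot eye height the geometric answer comes out near 2.6 nautical miles versus 2.87 from the rule of thumb. Working the geometry through gives a constant of roughly 1.06, not 1.17; the larger constant is commonly attributed to atmospheric refraction bending the line of sight, which neither answer above mentions.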
You need to get to a sufficiently high altitude or distance from the Earth to begin to see the curvature. A minimum height of around 60,000 to 70,000 feet is required to be able to see the curvature of the horizon.
If they are relatively near, so that the curvature of the earth has a negligible effect.
No. Level flight for an aircraft is not flight in a vertically straight line but flight that follows the curvature of the earth: an arc that maintains the same altitude.
Two 48-foot-tall people can walk about 16.97 miles from each other before they can no longer see each other due to the curvature of the Earth.

Build a right triangle. The base is 3960 miles, with the right angle at the surface of the Earth. The height is the distance from one observer to the horizon, and the hypotenuse is 3960 miles + 48 feet. Call the angle at the center of the Earth theta.

Cosine theta is adjacent over hypotenuse, or 3960 / 3960.0090909... Take the inverse cosine and you get theta = 0.12277 degrees. Multiply that by 2 pi R (R = 3960), divide by 360, and you get 8.49 miles. Double that and you get 16.97 miles.

Note: The height of the triangle does not matter. What matters is theta and its proportion to the circumference of the Earth.
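The steps above translate directly into code; this is a sketch using the same 3960-mile radius and 48-foot height:

```python
import math

R = 3960.0        # Earth radius in miles, as in the answer above
h = 48 / 5280     # 48 feet converted to miles

# Angle at the centre of the Earth: cos(theta) = adjacent / hypotenuse
theta = math.acos(R / (R + h))       # radians

# Arc along the surface from one observer to the shared horizon point
arc_to_horizon = theta * R

print(round(math.degrees(theta), 5))   # about 0.12277 degrees
print(round(2 * arc_to_horizon, 2))    # about 16.97 miles between the observers
```

Doubling the single-observer arc works because both people are the same height, so each one's horizon arc meets the other's at the midpoint.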