Because many square roots are irrational numbers: they cannot be written down exactly as decimals, since the digits after the decimal point continue forever without repeating, and there is no room to store an infinite string of digits.
Well, it's both: you're using a machine to compute an approximation. Why isn't it exact? Most square roots (such as the square root of two) are irrational numbers, so their decimal representation requires an infinite number of digits. Since we need finite answers, we round off.
There are many ways to do it. Have a look at the Wikipedia article (check the link).
See the related link for a detailed description of a manual method for calculating square roots.
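The linked method isn't reproduced here, but one classic manual technique for approximating square roots is Newton's method (also known as the Babylonian method): repeatedly average a guess with the number divided by that guess. A minimal Python sketch (the function name and tolerance are illustrative choices, not from the original answer):

```python
import math

def newton_sqrt(a, tol=1e-12):
    """Approximate sqrt(a) with Newton's method: x_next = (x + a/x) / 2."""
    if a < 0:
        raise ValueError("square root of a negative number is not real")
    if a == 0:
        return 0.0
    x = max(a, 1.0)  # any positive starting guess converges
    while abs(x * x - a) > tol * a:
        x = (x + a / x) / 2  # average the guess with a divided by the guess
    return x

print(newton_sqrt(2))  # close to math.sqrt(2), but still a rounded approximation
```

Each iteration roughly doubles the number of correct digits, which is why only a handful of steps are needed for ordinary precision; the result is still an approximation, for the reasons given above.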
You can approximate a square root as a decimal or fraction. If you want the exact number, you have to leave it with the square root sign.
In surd form, square roots need to have the same radical term before you can add or subtract them. However, unlike algebraic terms, square roots can always be added or subtracted numerically using their approximate (decimal) values.
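As a quick illustration of both points (the specific numbers are my own example, not from the answer): √8 simplifies to 2√2, so √2 + √8 can be combined exactly as 3√2, and the decimal approximations agree:

```python
import math

# sqrt(8) = sqrt(4 * 2) = 2 * sqrt(2), so the radical terms match:
# sqrt(2) + sqrt(8) = sqrt(2) + 2*sqrt(2) = 3*sqrt(2)
exact = 3 * math.sqrt(2)
approx = math.sqrt(2) + math.sqrt(8)
print(math.isclose(approx, exact))  # the decimal values agree

# sqrt(2) + sqrt(3) has no common radical, so in surd form it stays as is,
# but it can still be evaluated approximately:
print(math.sqrt(2) + math.sqrt(3))
```

In surd form, √2 + √3 cannot be simplified further, yet the last line still produces a usable decimal approximation.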