Suppose you need to round to n digits.
Look at the (n+1)th digit.
If it is 0 you do not need to round. You simply delete all digits after the nth.
If it is 1, 2, 3 or 4 you again delete all digits after the nth.
If it is 6, 7, 8 or 9 you increase the nth digit by 1 and delete all subsequent digits.
If it is 5, then
if the nth digit is even you delete all subsequent digits,
if the nth digit is odd you add one and delete all subsequent digits.
Statistically naive people recommend that 5 always be rounded up, without realising that this introduces an upward bias.
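A minimal sketch of this rule in Python, using the standard library's decimal module with its ROUND_HALF_EVEN mode; the helper name round_half_even and the sample values are just for illustration:

```python
from decimal import Decimal, ROUND_HALF_EVEN

def round_half_even(value, places):
    """Round a number (given as a string) to `places` decimal places,
    sending a trailing 5 toward the even neighbouring digit."""
    quantum = Decimal(1).scaleb(-places)   # e.g. places=2 -> Decimal('0.01')
    return Decimal(value).quantize(quantum, rounding=ROUND_HALF_EVEN)

print(round_half_even("2.345", 2))   # 2.34  (the digit before the 5 is even, so it stays)
print(round_half_even("2.355", 2))   # 2.36  (the digit before the 5 is odd, so it goes up)
print(round_half_even("2.346", 2))   # 2.35  (6, 7, 8 or 9: round up)
print(round_half_even("2.344", 2))   # 2.34  (1, 2, 3 or 4: just drop)
```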
Look to the right of the digit you are rounding.
If the next digit (and all that follow) is 0, there is no issue.
If the next digit is 1, 2, 3 or 4, then ignore it and all that follow.
If the next digit is 6, 7, 8, or 9 then add one to the digit that you are rounding.
If the next digit is 5 you can go either way. So as not to introduce a bias, you should round up half the time and round down half the time. The conventional solution, used by many statisticians, is to round up or down so that the previous digit is (or becomes) even.
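Incidentally, Python's built-in round() follows this same round-half-to-even convention, which is easy to check on exact halves (values such as 0.5 and 2.5 are represented exactly in binary, so they avoid floating-point surprises):

```python
# Python's round() uses round-half-to-even ("banker's rounding"),
# the statisticians' convention described above.
print(round(0.5))   # 0   (half goes to the even neighbour 0)
print(round(1.5))   # 2   (half goes to the even neighbour 2)
print(round(2.5))   # 2   (again toward even, so no systematic upward bias)
print(round(3.7))   # 4   (ordinary case: 7 rounds up)
```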
by rounding off the numbers
The answer will depend on the degree of rounding. To the nearest ten, it is 39120. To the nearest million, it is 0.
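For illustration only: the original number is not quoted here, but 39123 is one hypothetical value consistent with the result above, and Python's round() accepts a negative digit count for rounding to tens, millions, and so on:

```python
print(round(39123, -1))   # 39120  (nearest ten)
print(round(39123, -6))   # 0      (nearest million)
```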
There is no universal "better". Rounding off is a trade-off between losing accuracy and simplifying calculations. Also, if the other numbers in an addition are rounded to the nearest hundred, there is no point in rounding your number to the nearest ten.
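A rough sketch of that last point, with made-up numbers: the addend rounded to the nearest hundred can already be off by as much as 50, so rounding the other addend more finely than that gains very little.

```python
a = round(2763, -2)   # 2800: rounded to the nearest hundred, off by up to 50
b = round(518, -1)    # 520:  rounded to the nearest ten, off by up to 5
print(a + b)          # 3320: the total is still uncertain by roughly +/- 55
```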
Rounding a numerical value means replacing it by another value that is approximately equal but has a shorter, simpler, or more explicit representation.
You cannot add irrational numbers exactly using their decimal representations. You can round off irrational numbers and then add them, but in the process of rounding off the numbers you make them rational, so the sum becomes rational.
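A small illustration, with an arbitrary choice of four decimal places; decimal.Decimal keeps the arithmetic exact once the values have been rounded:

```python
import math
from decimal import Decimal

# pi and sqrt(2) are irrational, but their rounded values are finite
# decimals, hence rational, and so is their sum.
a = Decimal(math.pi).quantize(Decimal("0.0001"))       # Decimal('3.1416')
b = Decimal(math.sqrt(2)).quantize(Decimal("0.0001"))  # Decimal('1.4142')
print(a + b)                                            # 4.5558
```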