There is no universal "better". Rounding is a trade-off: you give up some accuracy to simplify the calculation. Also, if the other numbers in an addition are rounded to the nearest hundred, there is little point in rounding your number to the nearest ten, because the error of the sum is dominated by the coarsest rounding (see the example below).
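A minimal sketch of that last point in Python, assuming schoolbook round-half-up; the addends 437, 582, and 291 are made up for illustration:

```python
def round_to(n, multiple):
    # Round a non-negative integer n to the nearest multiple, halves rounding up.
    return ((n + multiple // 2) // multiple) * multiple

addends = [437, 582, 291]  # true sum: 1310

# All three rounded to the nearest hundred.
coarse = sum(round_to(n, 100) for n in addends)  # 400 + 600 + 300 = 1300

# One addend rounded more finely, to the nearest ten.
mixed = round_to(437, 10) + sum(round_to(n, 100) for n in addends[1:])  # 440 + 600 + 300 = 1340

print(coarse, mixed)  # 1300 1340 -- the finer rounding did not help here,
# because each hundred-rounded addend can still be off by up to 50.
```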
For whole numbers, rounding to the nearest ten keeps more accuracy than rounding to the nearest hundred. For decimals, rounding to the nearest hundredth is likewise more accurate than rounding to the nearest tenth.
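A quick illustration of both cases with Python's built-in round (note it rounds exact halves to the nearest even digit, so ties like 485 may not behave like schoolbook rounding):

```python
# Whole number: a negative second argument rounds to the left of the decimal point.
print(round(487, -1))     # 490 -- nearest ten
print(round(487, -2))     # 500 -- nearest hundred, coarser and less accurate

# Decimal: a positive second argument keeps that many decimal places.
print(round(3.14159, 2))  # 3.14 -- nearest hundredth
print(round(3.14159, 1))  # 3.1  -- nearest tenth, coarser again
```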
To the nearest hundred, it is 987400.
If you are rounding to the nearest one, it is 999.5. If you are rounding to the nearest ten, it is 995. If you are rounding to the nearest hundred, it is 950. If you are rounding to the nearest thousand, it is 500.
-- If rounding to the nearest 300, then it is 749.
-- If rounding to the nearest 200, then it is 699.
-- If rounding to the nearest 150, then it is 674.
-- If rounding to the nearest hundred, then it is 649.
-- If rounding to the nearest twenty, then it is 609.
-- If rounding to the nearest ten, then it is 604.
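The two answers above follow the same half-interval rule: rounding half-up in steps of s, the smallest value that rounds to a target T is T - s/2, and the largest whole number that does is T + s/2 - 1 when s is even. A minimal sketch, assuming the targets are 1000 and 600 as those answers suggest:

```python
def rounds_to(x, target, step):
    # True if x rounds to target under round-half-up in steps of `step`.
    return target - step / 2 <= x < target + step / 2

# Smallest values that round to 1000, matching the first answer above.
for step in (1, 10, 100, 1000):
    print(step, 1000 - step / 2)  # 999.5, 995.0, 950.0, 500.0

# Largest whole numbers that round to 600, matching the list above.
for step in (300, 200, 150, 100, 20, 10):
    largest = 600 + step // 2 - 1
    assert rounds_to(largest, 600, step) and not rounds_to(largest + 1, 600, step)
    print(step, largest)  # 749, 699, 674, 649, 609, 604
```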
213
It depends what you're rounding off to. 9050 rounded to the nearest hundred would be 9100. 9050 rounded to the nearest thousand would be 9000.
If you are rounding it off to the nearest tens place, it's 320; if it's to the nearest hundred, then it's 300.
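Both of the previous two answers use schoolbook round-half-up. Python's built-in round rounds halves to even instead (round(9050, -2) gives 9000, not 9100), so here is a minimal half-up helper; the input 316 in the second pair is a made-up stand-in, since the original number isn't shown:

```python
def round_half_up(n, multiple):
    # Round a non-negative integer n to the nearest multiple, halves rounding up.
    return ((n + multiple // 2) // multiple) * multiple

print(round_half_up(9050, 100))   # 9100 -- 9050 is exactly halfway, so it rounds up
print(round_half_up(9050, 1000))  # 9000

# Hypothetical input: any whole number from 315 to 324 gives 320 and 300.
print(round_half_up(316, 10))     # 320
print(round_half_up(316, 100))    # 300
```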
2722
The answer will depend on the degree of rounding: to the nearest ten, hundred, thousand?
I'm not sure, to be honest, but rounding to the nearest round number probably just makes the value easier to work with.
The number 486 rounded off to the nearest whole number is 486. When rounded to the nearest ten, it is 490. Rounding to the nearest hundred gives 500, and rounding to the nearest thousand results in 0, since 486 is less than 500 and so rounds down.
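A quick check with Python's built-in round (486 involves no exact halfway cases, so its half-to-even tie-breaking doesn't matter here):

```python
print(round(486))      # 486 -- nearest whole number
print(round(486, -1))  # 490 -- nearest ten
print(round(486, -2))  # 500 -- nearest hundred
print(round(486, -3))  # 0   -- nearest thousand: 486 < 500, so it rounds down
```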