For whole numbers, rounding to the nearest ten is usually the better choice; for decimals, rounding to the nearest hundredth keeps more accuracy.
There is no universal "better". Rounding is a trade-off between losing accuracy and simplifying calculations. Also, if the other numbers in an addition are rounded to the nearest hundred, there is no point in rounding your number to the nearest ten.
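To make the trade-off concrete, here is a minimal Python sketch of the schoolbook "round half up" rule that the answers on this page assume (the helper name round_to_nearest and the sample values are made up for illustration; Python's built-in round() rounds halves to even, which is why it is not used here):

```python
from decimal import Decimal, ROUND_HALF_UP

def round_to_nearest(value, step):
    """Round value to the nearest multiple of step, halves rounding up."""
    v, s = Decimal(str(value)), Decimal(str(step))
    return float((v / s).quantize(Decimal("1"), rounding=ROUND_HALF_UP) * s)

print(round_to_nearest(987, 10))        # 990.0 -> whole number, nearest ten
print(round_to_nearest(3.14159, 0.01))  # 3.14  -> decimal, nearest hundredth
```

The coarser the step, the simpler the result but the larger the possible error: with a step of 10 the result is off by at most 5, with a step of 100 by at most 50.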
To the nearest hundred, 987400
It depends on the place you are rounding to. To the nearest one, the smallest number that rounds up to 1000 is 999.5; to the nearest ten, it is 995; to the nearest hundred, 950; to the nearest thousand, 500.
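As a quick sanity check, here is a sketch that verifies all four boundary values round up to 1000 under the half-up rule (round_half_up is a hypothetical helper, not a library function):

```python
import math

def round_half_up(value, step):
    # floor(value/step + 0.5) implements "halves round up" for positive numbers.
    return math.floor(value / step + 0.5) * step

# Smallest value that still rounds up to 1000 at each place:
for value, step in [(999.5, 1), (995, 10), (950, 100), (500, 1000)]:
    assert round_half_up(value, step) == 1000, (value, step)
print("all four boundary values round to 1000")
```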
The largest whole number that rounds to 600 depends on the precision:
-- rounding to the nearest 300, it is 749;
-- rounding to the nearest 200, it is 699;
-- rounding to the nearest 150, it is 674;
-- rounding to the nearest hundred, it is 649;
-- rounding to the nearest twenty, it is 609;
-- rounding to the nearest ten, it is 604.
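The pattern generalizes: the midpoint target + step/2 rounds up to the next multiple, so the largest whole number that still rounds to the target sits one integer below that midpoint. A small sketch checking every case above (largest_rounding_to is a hypothetical name):

```python
def largest_rounding_to(target, step):
    # Greatest integer strictly below the midpoint target + step/2.
    return target + (step + 1) // 2 - 1

for step, expected in [(300, 749), (200, 699), (150, 674),
                       (100, 649), (20, 609), (10, 604)]:
    assert largest_rounding_to(600, step) == expected
```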
213
It depends on what you are rounding off to. 9050 rounded to the nearest hundred is 9100 (it sits exactly on the midpoint, and halves round up); rounded to the nearest thousand, it is 9000.
If you are rounding it off to the nearest tens place, it is 320; if to the nearest hundred, it is 300.
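Both of these answers follow the same integer half-up rule. A short sketch (round_half_up_int is a hypothetical helper, and the range 315-324 is only inferred from the answer above, since the original number was not given):

```python
def round_half_up_int(n, step):
    # Integer round-half-up: shift by half a step, then truncate.
    return (n + step // 2) // step * step

print(round_half_up_int(9050, 100))   # 9100 -> exact midpoint rounds up
print(round_half_up_int(9050, 1000))  # 9000

# Any whole number in [315, 325) gives 320 / 300, consistent with the answer:
for n in range(315, 325):
    assert round_half_up_int(n, 10) == 320
    assert round_half_up_int(n, 100) == 300
```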
2722
The answer will depend on the degree of rounding: to the nearest ten, hundred, thousand?
I'm not sure exactly, but rounding to the nearest convenient number makes calculations easier, at the cost of a little accuracy.
Any number greater than or equal to 745 but less than 755 can be rounded off to 750. When rounding to the nearest ten, a ones digit of 5 or more rounds up, so 745 through 749 round up to 750, and 750 through 754 round down to it.
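A brief sketch confirming the interval (round_to_ten is a hypothetical helper):

```python
def round_to_ten(n):
    # Half-up rounding to the nearest ten.
    return (n + 5) // 10 * 10

# Every whole number in [745, 755) rounds to 750; the neighbours do not.
assert all(round_to_ten(n) == 750 for n in range(745, 755))
assert round_to_ten(744) == 740 and round_to_ten(755) == 760
```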