If the number has at most one digit after the decimal point, you do nothing.
If it has more than one digit after the decimal point, you look at the second digit after the decimal point. If it is 1, 2, 3 or 4, you delete it and all subsequent digits. If it is 6, 7, 8 or 9, you add one to the first digit after the decimal point and delete the rest. If it is 5 followed by nothing but zeros, you round the first digit after the decimal point to the nearest even digit (leave it if it is already even, add one if it is odd) and delete from the second digit onwards; if the 5 is followed by any non-zero digit, you round up as usual.
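Here is a minimal sketch of that rule in Python (not part of the original answer; the helper name is just for illustration). The standard decimal module's ROUND_HALF_EVEN mode follows the same convention:

```python
from decimal import Decimal, ROUND_HALF_EVEN

def round_to_one_decimal(value: str) -> Decimal:
    # The number is passed as a string so no precision is lost before rounding.
    return Decimal(value).quantize(Decimal("0.1"), rounding=ROUND_HALF_EVEN)

# The second digit after the decimal point decides the outcome:
print(round_to_one_decimal("2.34"))   # 2.3  (4: delete it and what follows)
print(round_to_one_decimal("2.37"))   # 2.4  (7: add one to the first decimal digit)
print(round_to_one_decimal("2.35"))   # 2.4  (5: 3 is odd, so round up to the even digit 4)
print(round_to_one_decimal("2.45"))   # 2.4  (5: 4 is already even, so it stays)
```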
Some people, ignorant of bias and its effect, require the digit 5 to be rounded up in all cases. This introduces an upward bias in calculations for the following reason:
0 is not rounded.
The digits 1, 2, 3 and 4 are rounded down.
The digits 5, 6, 7, 8 and 9 are rounded up.
In the long term, therefore, there would be fewer instances of rounding down than of rounding up. As a simple example, round each of the first 20 integers to the nearest ten with 5s always rounded up (so 5 becomes 10 and 15 becomes 20): the mean of the rounded values is 11, not 10.5, which is the correct value.
To avoid this bias, 5 MUST be rounded down half the time and up half the time. The conventional way to do this while maintaining reproducibility of calculations is to round 5 so that the last surviving digit is even.
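A small sketch, again using Python's decimal module rather than anything from the original answer, reproduces the first-20-integers example under both rules:

```python
from decimal import Decimal, ROUND_HALF_UP, ROUND_HALF_EVEN

integers = [Decimal(n) for n in range(1, 21)]   # the first 20 integers: 1, 2, ..., 20

def mean_after_rounding(values, mode):
    # Round each value to the nearest ten with the given rule,
    # then take the mean of the rounded values.
    rounded = [v.quantize(Decimal("1E1"), rounding=mode) for v in values]
    return sum(rounded) / len(rounded)

print(sum(integers) / len(integers))                    # 10.5  (the true mean)
print(mean_after_rounding(integers, ROUND_HALF_UP))     # 11    (5 always rounded up: biased high)
print(mean_after_rounding(integers, ROUND_HALF_EVEN))   # 10.5  (round half to even: bias cancels)
```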
6.5 already has only one digit after the decimal point, so rounding it to one decimal place leaves it as 6.5.
2.21 rounded to one decimal place = 2.2
3.91 rounded off to one decimal place = 3.9
34.4 already has only one digit after the decimal point, so rounding it to one decimal place changes nothing: the answer is 34.4.
Likewise, 11.3 already has just one digit after the decimal point, so rounding it to 1 decimal place leaves it unchanged at 11.3.
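For reference, the worked examples above can be checked with the same decimal-based approach; the helper below is illustrative rather than anything from the original answers:

```python
from decimal import Decimal, ROUND_HALF_EVEN

def one_decimal(value: str) -> Decimal:
    # Round-half-to-even to one decimal place, as in the earlier sketch.
    return Decimal(value).quantize(Decimal("0.1"), rounding=ROUND_HALF_EVEN)

print(one_decimal("2.21"))   # 2.2  -- second decimal digit is 1, so it is simply dropped
print(one_decimal("3.91"))   # 3.9  -- likewise
print(one_decimal("34.4"))   # 34.4 -- already one decimal place, unchanged
print(one_decimal("11.3"))   # 11.3 -- unchanged
print(one_decimal("6.5"))    # 6.5  -- unchanged
```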