Q: Why is there no such thing as a remainder when dividing decimals?

Best Answer

Because when working with decimals you can always append a zero to the remainder and keep dividing, so the division never has to stop with a remainder left over.
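A minimal sketch of that idea (the function name and digit limit are illustrative, not from the answer): long division that, instead of stopping with a remainder, multiplies it by 10 ("adds a 0") and keeps producing decimal digits.

```python
def divide_decimal(dividend, divisor, places=6):
    """Divide two positive integers by long division, carrying any
    remainder into decimal places by appending zeros."""
    quotient = str(dividend // divisor)
    remainder = dividend % divisor
    if remainder == 0:
        return quotient                 # divides evenly, no decimal part
    digits = []
    for _ in range(places):
        remainder *= 10                 # "add a 0" to the remainder
        digits.append(str(remainder // divisor))
        remainder %= divisor
        if remainder == 0:
            break
    return quotient + "." + "".join(digits)

print(divide_decimal(7, 4))   # 1.75 -- the remainder 3 becomes .75
```

For 7 ÷ 4, whole-number division would stop at 1 remainder 3; appending zeros turns that remainder into the digits 7 and 5, giving 1.75.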


Continue Learning about Other Math

By the time you advance to dividing decimals, you no longer use remainders.

In dividing decimals you never get a remainder, whereas in dividing whole numbers you do. More to the point, perhaps, you are working in powers of 10 the whole time.


You have to divide the remainder by the number you are dividing by (the divisor).


Related questions

You put that number as your remainder.


Add a zero to the end (only if it is after the decimal point) and continue dividing.


The remainder is divided by the divisor to form another decimal, which is appended to the end of the answer.
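A quick illustration of that claim, using 7 ÷ 2 (the numbers are just an example): the remainder divided by the divisor supplies the decimal part of the answer.

```python
dividend, divisor = 7, 2
whole, remainder = divmod(dividend, divisor)   # 3 remainder 1
decimal_part = remainder / divisor             # 1 / 2 = 0.5
print(whole + decimal_part)                    # 3.5, the same as 7 / 2
```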

False.

All rational fractions are repeating. When you divide, eventually a remainder will repeat, and then so will the sequence of digits.
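That argument can be sketched in code (the function below is an illustration, not part of the answer): each decimal digit is determined entirely by the previous remainder, so once a remainder recurs the digits must cycle.

```python
def decimal_cycle(numerator, denominator):
    """Return the repeating block of digits of numerator/denominator,
    or "" if the decimal terminates."""
    seen = {}                 # remainder -> position where it first appeared
    digits = []
    remainder = numerator % denominator
    while remainder and remainder not in seen:
        seen[remainder] = len(digits)
        remainder *= 10
        digits.append(str(remainder // denominator))
        remainder %= denominator
    if remainder == 0:
        return ""             # terminating decimal, no repeating block
    return "".join(digits[seen[remainder]:])

print(decimal_cycle(1, 7))    # 142857
print(decimal_cycle(1, 3))    # 3
```

Since there are only finitely many possible remainders (0 through denominator − 1), the loop must either hit 0 (terminating decimal) or revisit a remainder (repeating decimal).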



The largest remainder, when dividing by any integer n, is n - 1. So, when dividing by 2, the largest remainder is 1.
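A quick check of that bound (the sample range is arbitrary): remainders mod n only ever take the values 0 through n − 1.

```python
n = 2
remainders = {k % n for k in range(20)}  # remainders of 0..19 divided by 2
print(remainders)        # {0, 1}
print(max(remainders))   # 1, i.e. n - 1
```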

If you are using the long division method, dividing a whole number is really a special case of dividing decimals. In both cases you may get a quotient with decimal places. There are exceptions with whole numbers: when you divide 100 by 2, the quotient 50 has no decimal places.
