Because when you are using decimals you can always add a 0 and keep dividing.
By the time you advance to the point of dividing decimals, you don't use remainders anymore.
Add a zero to the end (only if it's after the decimal point) and continue dividing.
In dividing decimals you never get a remainder, while in dividing whole numbers you do. More to the point, perhaps, you are working in powers of 10 all the time.
The remainder is divided by the divisor to form additional decimal digits, which are appended to the end of the answer.
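As a sketch of that process, here is a small Python function (the name `long_division` is just an illustrative choice) that carries the remainder past the decimal point by repeatedly multiplying it by 10:

```python
def long_division(dividend, divisor, places=10):
    """Long division: turn the remainder into decimal digits."""
    q, r = divmod(dividend, divisor)
    digits = []
    for _ in range(places):
        if r == 0:
            break                   # the division terminated exactly
        r *= 10                     # "add a zero" to the remainder
        d, r = divmod(r, divisor)   # next decimal digit and new remainder
        digits.append(str(d))
    return f"{q}." + "".join(digits) if digits else str(q)

print(long_division(7, 4))   # 1.75
print(long_division(10, 2))  # 5
```

Notice that 7 divided by 4 leaves remainder 3; carrying that remainder on gives the .75 instead of writing "1 remainder 3".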
All rational fractions have repeating (or terminating) decimal expansions. When you divide, eventually a remainder repeats, and from that point on the sequence of digits repeats as well.
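A minimal sketch of that idea: by remembering each remainder as it appears, you can detect the moment one repeats and split the expansion into its non-repeating and repeating parts (the helper name `decimal_expansion` is assumed here, not standard):

```python
def decimal_expansion(num, den):
    """Return (integer part, non-repeating digits, repeating digits)."""
    q, r = divmod(num, den)
    seen = {}            # remainder -> position where it first appeared
    digits = []
    while r and r not in seen:
        seen[r] = len(digits)
        r *= 10
        d, r = divmod(r, den)
        digits.append(str(d))
    if r:                # this remainder occurred before: cycle found
        start = seen[r]
        return str(q), "".join(digits[:start]), "".join(digits[start:])
    return str(q), "".join(digits), ""

print(decimal_expansion(1, 3))   # ('0', '', '3')    i.e. 0.333...
print(decimal_expansion(1, 6))   # ('0', '1', '6')   i.e. 0.1666...
```

Since there are only finitely many possible remainders, the loop must either hit 0 (terminating decimal) or revisit a remainder (repeating decimal).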
The largest possible remainder when dividing by any integer n is n - 1. So, when dividing by 2, the largest remainder is 1.
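You can confirm this with a quick check: the remainders modulo n are exactly 0 through n - 1, so n - 1 is the largest one that ever appears.

```python
n = 7
remainders = {k % n for k in range(100)}  # remainders of many divisions by n
print(sorted(remainders))                  # [0, 1, 2, 3, 4, 5, 6]
print(max(remainders) == n - 1)            # True
```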
That depends on what number you are dividing by 11.
If you are using the long division method, dividing a whole number is actually a special case of dividing decimals. In both cases you may get a quotient with decimal places. There are exceptions for whole numbers, though: when you divide 100 by 2, the quotient 50 has no decimal places.
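A minimal sketch of that distinction using Python's built-in `divmod`: a remainder of 0 means the whole-number quotient is exact, while a nonzero remainder means long division would continue past the decimal point.

```python
# Remainder 0: the whole-number quotient is exact, no decimal places needed.
q, r = divmod(100, 2)
print(q, r)      # 50 0

# Nonzero remainder: carrying it on gives decimal places (7/2 = 3.5).
q, r = divmod(7, 2)
print(q, r)      # 3 1
```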