When you divide a whole number by a decimal less than 1, why is the quotient greater than the whole number?
Consider the whole number 1, and divide it by 0.1. The expression 1 ÷ 0.1 asks: how many 0.1s are there in 1?
That is, 1/(0.1) can be written as 1/(1/10), which asks how many tenths there are in 1. Dividing by 1/10 is the same as multiplying by its reciprocal, 10, so the quotient is 10. In general, any decimal between 0 and 1 is a fraction less than 1, and its reciprocal is greater than 1. Dividing by such a fraction is the same as multiplying by that reciprocal, which always yields a quotient larger than the original number.
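The reasoning above can be checked with a small sketch using exact fractions (the whole numbers and divisors below are arbitrary examples, not from the question):

```python
# Check: dividing a whole number by a fraction between 0 and 1
# gives a quotient larger than the original number, because it is
# the same as multiplying by the reciprocal, which is > 1.
from fractions import Fraction

for n in [1, 7, 12]:
    for d in [Fraction(1, 10), Fraction(1, 4), Fraction(9, 10)]:
        quotient = n / d          # same as n * (1/d), and 1/d > 1
        assert quotient > n
        print(f"{n} / {d} = {quotient}")
```

Using Fraction instead of floats keeps the arithmetic exact, so 1 / (1/10) is precisely 10.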