Division tells you how many times you can take the divisor away from the dividend to get zero.
For example, 18 ÷ 6 = 3 means you can take 6 away from 18 three times to reach zero:
18 - 6 = 12
12 - 6 = 6
6 - 6 = 0
When you subtract zero, the dividend doesn't change, so no matter how many times you take zero away, you will never reach zero:
18 - 0 = 18
18 - 0 = 18
18 - 0 = 18
...
Thus you cannot divide by zero.
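The repeated-subtraction view above can be sketched in code. This is a minimal illustration, not a real division routine: the step cap (an assumption added here) is only there so the divisor-zero case stops instead of looping forever.

```python
def divide_by_subtraction(dividend, divisor, max_steps=1000):
    """Count how many times divisor can be subtracted from dividend
    before reaching zero. Returns None if zero is never reached."""
    count = 0
    while dividend > 0:
        if count >= max_steps:
            return None  # subtracting never reaches zero, e.g. when divisor == 0
        dividend -= divisor
        count += 1
    return count

print(divide_by_subtraction(18, 6))  # 3: 18 -> 12 -> 6 -> 0
print(divide_by_subtraction(18, 0))  # None: 18 - 0 is always 18
```

With divisor 6 the loop reaches zero in three steps; with divisor 0 the dividend never changes, which is exactly why no answer exists.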
Chat with our AI personalities
Let us try to divide 6 by 0:
0x1=0
0x2=0
0x3=0
0x4=0
0x5=0
0x6=0
We have observed that whatever number is multiplied by zero, the result is zero. Thus we will never get an exact quotient: the dividend always remains as the remainder after subtraction. So when a number is divided by zero, the quotient is undefined (often loosely described as infinite).
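The same search can be done mechanically: division asks which number, times the divisor, gives the dividend. This short sketch (the range bound of 100 is an arbitrary assumption for illustration) shows that no candidate works when the divisor is zero.

```python
# Division asks: which n satisfies n * divisor == dividend?
# For 6 / 0 we need n * 0 == 6, but n * 0 is always 0.
candidates = range(100)
solutions = [n for n in candidates if n * 0 == 6]
print(solutions)  # [] -- no quotient exists for 6 / 0
```

No matter how far the search is extended, the list of solutions stays empty, which is what "undefined" means here.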
I would like to suggest another way of looking at this problem. Suppose you were dividing 1 by a series of increasingly large numbers: 1/2, 1/3, 1/4, and so forth. These fractions get smaller and smaller, but they never reach zero. No matter how large the number by which 1 is divided, the result is still a positive number greater than zero. So zero is not just the integer that comes before 1; it also represents infinite smallness. You would have to divide 1 (or any number) by infinity to get zero. Dividing by an infinitely small number is the equivalent of multiplying by an infinitely large number, and if you multiply by infinity, the result is also infinity. So, in that sense, you can divide by zero and get infinity, but that is not a useful result, and we generally avoid doing it.
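This limit argument can be illustrated numerically. A small sketch (the particular sample values are chosen here for illustration): as the divisor grows, 1 divided by it shrinks toward zero; symmetrically, as the divisor shrinks toward zero, the quotient grows without bound.

```python
# As the divisor gets larger, 1/n approaches 0 but never reaches it.
for n in [10, 1_000, 1_000_000]:
    print(f"1/{n} = {1 / n}")

# As the divisor approaches 0 from above, 1/d grows without bound.
for d in [0.1, 0.001, 0.000001]:
    print(f"1/{d} = {1 / d}")
```

Neither sequence ever arrives: 1/n never equals 0, and 1/d has no largest value, which is the informal sense in which "1/0 is infinity".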