It must be x*(x+1), where x is a positive integer. For example, the least common multiple of 6 and 7 is 42 = 6*7.
To see this, suppose there were a smaller common multiple. It would have the form
a*x = b*(x+1)
for some positive integers a and b. Note that a =/= b, because if a = b then a*x = a*(x+1), which is impossible for a > 0.
Then, since this common multiple is smaller than x*(x+1), we have
a*x < x*(x+1) => a < x+1
b*(x+1) < x*(x+1) => b < x,
and multiplying these two inequalities gives a*b < x*(x+1).
Also, a*x = b*(x+1) => (a-b)*x = b => x = b/(a-b) and (x+1) = a/(a-b), where a > b since x > 0. Therefore
x*(x+1) = a*b/(a-b)^2 < x*(x+1)/(a-b)^2
=> (a-b)^2 < 1
=> |a-b| < 1.
But a and b are integers, so |a-b| < 1 forces a = b, which we have already ruled out. This contradiction shows that x*(x+1) is the smallest common multiple of x and (x+1).
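As an illustrative sanity check (separate from the proof), here is a short Python sketch that finds the least common multiple of x and x+1 by direct search and compares it with x*(x+1); the helper name and the range bound of 50 are just arbitrary choices for the example.

```python
def smallest_common_multiple(m: int, n: int) -> int:
    """Return the least common multiple of positive integers m and n by direct search."""
    k = m
    while k % n != 0:   # step through multiples of m until one is also a multiple of n
        k += m
    return k

# Check the claim lcm(x, x+1) == x*(x+1) for the first few positive integers.
for x in range(1, 51):  # the upper bound 50 is an arbitrary choice for illustration
    assert smallest_common_multiple(x, x + 1) == x * (x + 1)

print("verified: lcm(x, x+1) == x*(x+1) for x = 1..50")
```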