How are addition and multiplication different?
For natural numbers, multiplication can be thought of as
shorthand for a particular way that we want to do addition. Once we
define addition, we come across the need to add a single number x
to itself a certain number of times, say, y times. So the question
is: how do you write that? We could always just write

x + x + x + x + x + x + x + . . . , making sure we've included enough
x's,

or even

x + . . . + x (y times) [simply stating that the addition happens y
times],

but writing it out either way each time would produce algebra
textbooks of biblical proportions.
So, we use the shorthand x*y and understand it to mean "add up y
copies of x" (or, equivalently, add up x copies of y).
When we allow x and y to be any real numbers, this
interpretation is valid only if you are comfortable accepting the
notion of doing something pi times, or even 1/3 times
for that matter (the latter being easier to swallow than the
former). Here, multiplication is just another classic
abstractification of an intuitive concept.
OR
As binary operators [a(x,y) = x+y, m(x,y) = x*y], they are clearly
different, since for most values of x and y, a(x,y) ≠ m(x,y); the
two agree only when x + y = x*y, i.e. when x = y/(y-1) (which
requires y ≠ 1).
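A quick numerical check of that exception, again just a sketch of
mine (any y other than 1 works, since x = y/(y-1) is undefined at
y = 1):

    # verify that a(x, y) == m(x, y) exactly when x = y/(y-1)
    def a(x, y): return x + y
    def m(x, y): return x * y

    for y in [2.0, 3.0, 0.5, -4.0]:  # any y != 1
        x = y / (y - 1)
        print(y, x, a(x, y), m(x, y))  # the last two columns match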