Assume it is rational. Then 2 + √2 = q for some rational number q, so √2 = q - 2. However, the rational numbers are closed under addition: a/b + c/d = (ad + bc)/(bd), so adding two fractions always yields another fraction. Therefore q - 2 = q + (-2) is rational, since both q and -2 are rational. This would make √2 rational, which contradicts the known irrationality of √2. Therefore the assumption that 2 + √2 is rational must be false.
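For reference, the same argument written out symbolically (a minimal sketch; it takes the irrationality of √2 as a known fact):

```latex
Suppose $2 + \sqrt{2} = q$ with $q \in \mathbb{Q}$. Then
\[
  \sqrt{2} = q - 2 = q + (-2),
\]
and since $\mathbb{Q}$ is closed under addition, $q + (-2) \in \mathbb{Q}$.
This contradicts the irrationality of $\sqrt{2}$, so $2 + \sqrt{2} \notin \mathbb{Q}$.
```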
There cannot be a proof, since the assertion is not necessarily true: √2 × √3 = √6, and all three of these numbers are irrational.
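To see why this is a genuine counterexample (a brief sketch, using the standard fact that √n is irrational whenever n is not a perfect square):

```latex
\[
  \sqrt{2}\cdot\sqrt{3} = \sqrt{2 \cdot 3} = \sqrt{6},
\]
and $6$ is not a perfect square, so $\sqrt{6}$ is irrational.
Hence the product of two irrational numbers need not be rational.
```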
It cannot. It can only show square roots, which represent only a small proportion of the irrational numbers.
It is an irrational number and rounds to 5.66 to the nearest hundredth.
Certainly. Otherwise, there would be a rational number whose square is an irrational number; that is not possible. To show this, let p/q be any rational number, where p and q are integers and q ≠ 0. Then the square of p/q is p^2/q^2. Since p^2 and q^2 are both integers (and q^2 ≠ 0), their quotient is, by definition, a rational number. Thus the square of every rational number is itself rational.
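The same contrapositive argument written out symbolically (a small sketch of the reasoning above):

```latex
Let $x = \tfrac{p}{q}$ with $p, q \in \mathbb{Z}$ and $q \neq 0$. Then
\[
  x^2 = \frac{p^2}{q^2},
\]
where $p^2, q^2 \in \mathbb{Z}$ and $q^2 \neq 0$, so $x^2 \in \mathbb{Q}$.
By the contrapositive, if $x^2$ is irrational, then $x$ is irrational;
in particular, the square root of an irrational number is irrational.
```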