Assume the square root of 3 is rational, so that it can be written as a/b, where a and b are integers in lowest terms. Squaring both sides gives 3 = a^2/b^2. To see why this is impossible, we need one more part of the argument.
Suppose

3 = p^2/q^2

where p and q are integers and p/q is in lowest terms. Then

3q^2 = p^2

just as above with a and b. This means p^2 is divisible by 3. Since 3 is prime, p must be divisible by 3 as well, so p^2 is divisible by nine. So

q^2 = p^2/3

and, because p^2 is divisible by nine, q^2 is divisible by three; again because 3 is prime, q itself must be divisible by three. But that means p and q are both divisible by three, so they weren't in lowest terms, which is a contradiction because we said they were.
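The contradiction above can be illustrated (not proved) computationally. Here is a minimal sketch, assuming Python's fractions module; the helper name squares_to is mine, not from the original answer. It searches for any fraction p/q in lowest terms whose square is 3 and, as the proof predicts, finds none.

```python
from fractions import Fraction
from math import gcd

def squares_to(n, limit=100):
    """Return any p/q (in lowest terms, q <= limit) whose square equals n, else None."""
    for q in range(1, limit + 1):
        for p in range(1, 2 * limit + 1):
            if gcd(p, q) == 1 and Fraction(p, q) ** 2 == n:
                return Fraction(p, q)
    return None

print(squares_to(3))  # None: no rational whose square is 3 is found
print(squares_to(4))  # 2: for a perfect square the search succeeds at once
```

The search is of course finite, so it only illustrates the claim for small denominators; the proof above is what rules out every denominator.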
Search for the proof for the irrationality of the square root of 2. The same reasoning applies to any positive integer that is not a perfect square. In summary, the square root of any positive integer is either a whole number, or - as in this case - it is irrational.
Most high school algebra books show a proof (by contradiction) that the square root of 2 is irrational. The same proof can easily be adapted to the square root of any positive integer that is not a perfect square. You can find the proof (for the square root of 2) in the Wikipedia article on "irrational number", near the beginning of the page (under "History").
sqrt(2) is irrational, and 3 is rational. The product of an irrational number and a non-zero rational number is irrational, so the product is irrational. A more fundamental proof would follow the lines of the proof that sqrt(2) is irrational.
This can easily be proved by contradiction. Without loss of generality, I will take specific numbers as an example; the proof extends easily to any rational + irrational number. Assumption: 1 plus the square root of 2 is rational. (It is a well-known fact that the square root of 2 is irrational; no need to prove it here, and any other irrational number would do.) This rational sum can be written as p/q, where p and q are whole numbers (this is basically the definition of a "rational number"). Then the square root of 2, which is equal to the sum minus 1, is:

p/q - 1 = p/q - q/q = (p - q)/q

Since the difference of two whole numbers is a whole number, this makes the square root of 2 rational, which is a contradiction.
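The fraction arithmetic in that argument can be checked directly with exact rational arithmetic. A quick sketch using Python's Fraction; the values of p and q are arbitrary examples of mine, standing in for the p and q above.

```python
from fractions import Fraction

# If 1 + sqrt(2) were rational, say p/q, then sqrt(2) = p/q - 1 = (p - q)/q,
# which would itself be a ratio of whole numbers. Check that identity:
p, q = 7, 5  # arbitrary example values, for illustration only
assert Fraction(p, q) - 1 == Fraction(p - q, q)
print(Fraction(p, q) - 1)  # 2/5: still a ratio of integers
```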
I am not quite sure what you mean by "derive" - what sort of derivation you will accept. If you take the square root of an integer, then unless the integer happens to be a perfect square, you get an irrational number. And yes, there is a proof of that. It can be found in most high school algebra books.
The square root of a positive integer can only be either an integer or an irrational number. (The proof of this is basically the same as the proof, in high school algebra books, that the square root of 2 is irrational.) Since in this case 32 is not the square of an integer, it follows that its square root is an irrational number.
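That dichotomy gives a simple computational test: check whether the integer is a perfect square, and if not, its square root is irrational. A minimal sketch assuming Python's math.isqrt; the function name sqrt_kind is mine.

```python
from math import isqrt

def sqrt_kind(n):
    """Classify sqrt(n), for a positive integer n, as 'integer' or 'irrational'."""
    r = isqrt(n)  # integer square root: largest r with r*r <= n
    return "integer" if r * r == n else "irrational"

print(sqrt_kind(32))  # irrational: 32 is not a perfect square
print(sqrt_kind(36))  # integer: 36 == 6^2
```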
No, the square root of 3 is not rational; it is irrational. More generally: if p is a prime number, then the square root of p is irrational, and the proof of this fact mimics the famous proof of the irrationality of the square root of 2.
By an indirect proof. Assuming the square root is rational, it can be written as a fraction a/b, with integer numerator and denominator (this is basically the definition of "rational"), taken in lowest terms. Squaring gives a^2/b^2, and a divisibility argument like the one in the proof for the square root of 2 shows that a and b must then share a common factor, contradicting lowest terms. Hence, the assumption that the square root is rational is false.
Consider a rational number p. Since p is rational, p = x/y where x and y are integers. x is an integer, so x*x is an integer; y is an integer, so y*y is an integer. So p^2 = (x/y)^2 = x^2/y^2 is a ratio of two integers, and is therefore rational.
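The closure argument above can be spot-checked with exact rational arithmetic. A minimal sketch using Python's Fraction; the particular value 3/7 is just an example of mine.

```python
from fractions import Fraction

# p = x/y is rational; its square x^2/y^2 is again a ratio of two integers.
p = Fraction(3, 7)
square = p ** 2
print(square)                     # 9/49
print(square == Fraction(9, 49))  # True: the square is still rational
```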
Yes. Google Cauchy's proof.
The view that no rational proof is necessary for belief is called faith.
The argument why the square root of 2 is irrational can be found in most high school algebra books. You can also find this proof, and several other proofs that the square root of 2 is irrational, in the Wikipedia article "Square root of 2". The same argument can be applied to the square root of any natural number that is not a perfect square.
An irrational number. Proof: let x be any rational number and y be any irrational number, and assume that their sum z = x + y is rational. If x is a rational number, then -x is also a rational number. Therefore x + y + (-x) = y would be a sum of rational numbers, and hence rational. But here is the contradiction: we assumed y is irrational. Hence our assumption is wrong, and x + y is not rational; it will always be irrational.