The square root of 0 is 0
What number multiplied by itself would equal zero? ----> 0
More info:
Remember that a square root can also be written as "raise to the one-half power":
sqrt(x) = x^(1/2)
0 raised to any non-negative power is 0, so yes, it is defined, and it is 0.
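A quick numeric check of this (a minimal sketch, assuming Python's standard math module):

```python
import math

# The square root of 0, and the same value via the one-half power
print(math.sqrt(0))   # -> 0.0
print(0 ** 0.5)       # -> 0.0
```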
0 cannot be raised to *negative* powers, however, because this would place the 0 in the denominator:
x^(-1) = 1/x
If you let x=0, you'll quickly see that you end up with 1/0, and division by 0 is not allowed in mathematics. (There are still ways to get answers out of some problems that require dividing by zero, and you'll learn about this when you learn about taking limits.)
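Programming languages enforce this rule too. In Python, for example, raising 0 to a negative power is rejected outright for exactly this reason:

```python
# 0 ** -1 would mean 1/0, so Python raises an error instead of answering
try:
    0 ** -1
except ZeroDivisionError as err:
    print("not allowed:", err)
```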
By the way, this illegal division by zero is often used in the following false proof, which you can use to confuse your friends:
Let: x = 1
If x = 1, then: x^2 - 1 = x - 1
Factor the left side: (x + 1)*(x - 1) = x - 1
Divide both sides by (x-1): x + 1 = (x - 1)/(x - 1)
Simplify: x + 1 = 1
Simplify: x = 0
Recall the initial definition x = 1; combining it with x = 0, we've "proved": 1 = 0
Obviously, there's a flaw in this proof. Can you find it?
The problem is the step where we divide both sides by (x-1): if x=1, at that point in the proof we're dividing by 0, which is illegal. In the next step, we let (x-1)/(x-1) evaluate to 1, when in fact it's 0/0, which is undefined.
At the college level, whenever a proof requires dividing both sides of an equation by an expression containing a variable, you have to note at that step which value of the variable is not allowed. In the above proof, at the step where we divide both sides by (x-1), you'd have to note in the margin: "x≠1; since we defined x=1 in the first step, the proof can no longer continue." If we had not defined x=1 at the beginning, you'd still have to note "x≠1" at that step.
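You can watch the "proof" break by running its steps numerically (a minimal sketch in plain Python, no libraries needed):

```python
x = 1

# Both sides of x^2 - 1 = x - 1 agree so far (0 == 0):
assert x ** 2 - 1 == x - 1

# But the "divide both sides by (x - 1)" step divides by zero:
try:
    (x - 1) / (x - 1)
except ZeroDivisionError:
    print("the proof breaks here: (x - 1)/(x - 1) is 0/0, which is undefined")
```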
0
The square root of 1 is 1. The square root of 0 is 0.
The square roots of 0 and 1 equal the squares of 0 and 1: both numbers are their own square roots, since 0*0 = 0 and 1*1 = 1.
The answer is 0.
Quadratics have the form ax^2 + bx + c = 0, and the discriminant is b^2 - 4ac. For a quadratic where b^2 - 4ac = 16 - 16 = 0, the square root of the discriminant is the square root of 0, which is 0.
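Here's that case worked through with the quadratic formula (a minimal sketch; the coefficients a=1, b=4, c=4, i.e. x^2 + 4x + 4 = 0, are an assumed example that makes b^2 - 4ac = 16 - 16):

```python
import math

# Assumed example quadratic with a zero discriminant: x^2 + 4x + 4 = 0
a, b, c = 1, 4, 4
disc = b ** 2 - 4 * a * c          # 16 - 16 = 0

# With sqrt(disc) = 0, the +/- in the quadratic formula collapses
# to a single repeated root:
root = (-b + math.sqrt(disc)) / (2 * a)
print(disc, root)  # -> 0 -2.0
```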
Any number greater than 0 has two square roots, a positive square root and a corresponding negative square root. Rounded to two decimal places, the square roots of 134 are ±11.58.
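A quick check of that pair of roots (a minimal sketch using Python's math module):

```python
import math

# Every number greater than 0 has a positive and a negative square root
r = math.sqrt(134)
print(round(r, 2), round(-r, 2))  # -> 11.58 -11.58
```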
The square root of 0 would be 0 because 0 * 0 = 0
Sqrt(0) = 0 so the answer is 0.
The square root of 0 is 0, which is a real number.
The square root of 0 is 0. Since 0 has no positive or negative equivalent, this is its only square root.
If x is 0, the square root is 0 also.
sqrt(a) + sqrt(b) is different from sqrt(a + b) unless a = 0 and/or b = 0. (sqrt = square root)
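A quick illustration of both halves of that claim (a minimal sketch; the values a=9, b=16 are assumed just to make the arithmetic clean):

```python
import math

# In general, sqrt(a) + sqrt(b) is NOT sqrt(a + b):
a, b = 9, 16
print(math.sqrt(a) + math.sqrt(b))  # 3.0 + 4.0 = 7.0
print(math.sqrt(a + b))             # sqrt(25)  = 5.0

# ...but the two do agree whenever a (or b) is 0:
print(math.sqrt(0) + math.sqrt(b) == math.sqrt(0 + b))  # -> True
```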