No, not always. If a number is greater than 1, its square root is smaller than the number; if a positive number is less than 1, its square root is larger than the number. (At 0 and 1, a number equals its own square root.)
It is if the number is greater than 1. If a positive number is less than 1, then it is smaller than its own square root.
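A quick check in Python (standard `math` module only) makes both cases concrete:

```python
import math

# Compare each number with its own square root.
for x in [4.0, 2.0, 1.0, 0.25, 0.01]:
    r = math.sqrt(x)
    relation = ">" if x > r else ("<" if x < r else "=")
    print(f"{x} {relation} sqrt({x}) = {r}")

# Numbers above 1 exceed their roots; numbers between 0 and 1 fall below them.
assert math.sqrt(4.0) < 4.0
assert math.sqrt(0.25) > 0.25
assert math.sqrt(1.0) == 1.0
```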
A square root is simplified when:
- the radicand has no perfect-square factors other than 1,
- the radicand contains no fractions, and
- there are no square roots in the denominator.

(The radicand is the number and/or variables underneath the square root sign.)
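The first rule can be sketched in plain Python: pull every perfect-square factor out of the radicand, so that sqrt(n) becomes a·sqrt(b) with b square-free. (The function name `simplify_sqrt` is my own, chosen for illustration.)

```python
def simplify_sqrt(n: int) -> tuple[int, int]:
    """Rewrite sqrt(n) as a*sqrt(b), where b has no perfect-square factor > 1."""
    a, b = 1, n
    f = 2
    while f * f <= b:
        # Move each squared factor f*f out from under the radical as f.
        while b % (f * f) == 0:
            b //= f * f
            a *= f
        f += 1
    return a, b

# sqrt(72) = sqrt(36 * 2) = 6*sqrt(2)
print(simplify_sqrt(72))  # (6, 2)
```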
The square root of a number acts as a sort of 'multiplicative middle' of the number: when you multiply the square root by itself, you get the original number back. When you search for factor pairs of a number (easiest to see with whole numbers), it is intuitive that each pair must have one factor on either side of this 'multiplicative middle'.
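The 'multiplicative middle' idea can be seen directly by listing factor pairs, here for the illustrative example n = 36:

```python
# Factor pairs of 36: in each pair, one factor sits at or below
# the square root (6) and the other at or above it.
n = 36
pairs = [(d, n // d) for d in range(1, n + 1) if n % d == 0 and d <= n // d]
print(pairs)  # [(1, 36), (2, 18), (3, 12), (4, 9), (6, 6)]

# The smaller factor never exceeds sqrt(n); the larger never falls below it.
assert all(d * d <= n <= (n // d) ** 2 for d, _ in pairs)
```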
Since the square root of a number is the value that, multiplied by itself, gives the original number, it makes sense that the larger the original number, the larger its square root. So the square root of 2 is greater than the square root of 1.5.
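This monotonicity is easy to confirm numerically with the standard `math` module:

```python
import math

# sqrt is an increasing function: a larger input gives a larger root.
assert math.sqrt(2) > math.sqrt(1.5) > math.sqrt(1.0)
print(math.sqrt(2), math.sqrt(1.5))  # approx. 1.414 and 1.225
```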