When you multiply a number by 1, you get the same number back, e.g. 10 x 1 = 10.
Start with 0.0001. Multiply it by 10 and you get 0.001; multiply 0.001 by 10 and you get 0.01; multiply 0.01 by 10 and you get 0.1; multiply 0.1 by 10 and you get 1. So to get from 0.0001 to 1, you had to multiply by 10 four times, that is, by 10,000 in total. But if you multiply a number by 10,000, you must also divide it by the same number to retain the original value. Hence 0.0001 is the same as 1/10000.
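A minimal sketch of the same reasoning in Python, using `fractions.Fraction` to keep the arithmetic exact (a plain float like 0.0001 is only an approximation):

```python
from fractions import Fraction

x = Fraction(1, 10000)   # exact representation of 0.0001
for _ in range(4):       # multiply by 10 four times
    x *= 10
print(x)                 # 1 -- four factors of 10 bring us back to 1

# Equivalently: 0.0001 * 10**4 == 1, so 0.0001 == 1/10000
print(Fraction(1, 10000) * 10**4 == 1)  # True
```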
A fraction "of" a number is the same as the fraction "times" a number. In other words, you must multiply 1/10 times 80.A fraction "of" a number is the same as the fraction "times" a number. In other words, you must multiply 1/10 times 80.A fraction "of" a number is the same as the fraction "times" a number. In other words, you must multiply 1/10 times 80.A fraction "of" a number is the same as the fraction "times" a number. In other words, you must multiply 1/10 times 80.
* Multiply by 10 (by adding a zero to the right of the number),
* then double.
Example: multiply 52 by 20. 52 x 10 = 520; double 520 = 1040.
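A small sketch of that two-step trick in Python; `times_20` is a hypothetical helper name, and appending a zero only works for non-negative whole numbers:

```python
def times_20(n: int) -> int:
    # Multiply by 10 by appending a zero to the digits, then double.
    shifted = int(str(n) + "0")  # 52 -> 520
    return shifted * 2           # 520 -> 1040

print(times_20(52))  # 1040
```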
The only pair of positive integers that multiply to 61 is 1 and 61, because 61 is prime. But you could also multiply 10 and 6.1, or any of infinitely many other pairs of numbers.
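A quick way to confirm that in Python, assuming the question asks which whole numbers multiply to 61; `factor_pairs` is a hypothetical helper:

```python
def factor_pairs(n: int):
    """Return all positive-integer pairs (a, b) with a * b == n and a <= b."""
    return [(a, n // a) for a in range(1, int(n**0.5) + 1) if n % a == 0]

print(factor_pairs(61))  # [(1, 61)] -- 61 is prime
print(10 * 6.1)          # 61.0 -- non-integer pairs are unlimited
```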
To find ten percent of a number, multiply the number by 1/10, or 0.1. You can also divide it by ten.
Multiply the number by 0.1
Multiply the number by 1.1 (this increases the number by 10%).
Multiply the number by 0.1; that is the decimal form of 10%.
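The two multipliers above do different jobs; a short sketch (the value 250 is just an illustrative example):

```python
x = 250.0

ten_percent_of_x = x * 0.1    # 25.0  -- "10% of" the number
x_plus_ten_percent = x * 1.1  # 275.0 -- the number increased by 10%

print(ten_percent_of_x, x_plus_ten_percent)
```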
Use repeated addition: 5 x 2 = 5 + 5; 10 x 4 = 10 + 10 + 10 + 10.
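A minimal sketch of that idea in Python; `multiply_by_adding` is a hypothetical helper name:

```python
def multiply_by_adding(a: int, b: int) -> int:
    """Multiply a by b via repeated addition: a*b = a + a + ... (b times)."""
    total = 0
    for _ in range(b):
        total += a
    return total

print(multiply_by_adding(5, 2))   # 10 == 5 + 5
print(multiply_by_adding(10, 4))  # 40 == 10 + 10 + 10 + 10
```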
Yes. 10 x 10 = 10², which is 100.
To convert centimeters to millimeters, multiply the number of centimeters by 10. This is because there are 10 millimeters in 1 centimeter.
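A one-step conversion sketch in Python; `cm_to_mm` and the sample value 2.5 are illustrative:

```python
def cm_to_mm(centimeters: float) -> float:
    # There are 10 millimeters in 1 centimeter.
    return centimeters * 10

print(cm_to_mm(2.5))  # 25.0
```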