It is really easy.
A decimal number is simply a way of representing a number in which the place value of each digit is ten times that of the digit to its right. It may or may not contain a fractional part; if not, the decimal representation does not require a decimal point. So the decimal for 1 mega is just 1,000,000. If you want it as a decimal fraction of some other quantity, then you need to specify that second quantity.
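For instance, 345 = 3×100 + 4×10 + 5×1. Here is a minimal Python sketch of that place-value expansion (the helper name expand_place_values is purely illustrative):

    # Each digit's value is ten times the value of the digit to its right.
    def expand_place_values(n: int) -> str:
        digits = str(n)
        terms = [
            f"{d} x 10^{len(digits) - 1 - i}"
            for i, d in enumerate(digits)
        ]
        return " + ".join(terms)

    print(expand_place_values(345))  # 3 x 10^2 + 4 x 10^1 + 5 x 10^0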
Mega officially means a million.
In computing, giga is 1024 (2^10) times bigger than mega, because computers work in powers of two internally. In normal metric (SI) usage, giga is 1000 (10^3) times bigger than mega.
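A quick Python sketch contrasting the two conventions (the constant names are just for illustration; the binary quantities are the ones formally written MiB and GiB):

    # SI (metric) prefixes step by powers of 10; the traditional
    # computing usage steps by powers of 2.
    MEGA_SI = 10**6       # 1 000 000
    GIGA_SI = 10**9       # 1 000 000 000
    MEGA_BIN = 2**20      # 1 048 576  (MiB)
    GIGA_BIN = 2**30      # 1 073 741 824  (GiB)

    print(GIGA_SI // MEGA_SI)    # 1000 -> giga is 1000x mega in metric
    print(GIGA_BIN // MEGA_BIN)  # 1024 -> giga is 1024x mega in binary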
No, 1 is not a perfect number. A perfect number is one that equals the sum of its proper divisors (its positive divisors excluding itself); 1 has no proper divisors, so that sum is 0, not 1. For comparison, 6 is perfect because 1 + 2 + 3 = 6.
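A minimal Python check of that definition (the helper name is_perfect is purely illustrative):

    # A perfect number equals the sum of its proper divisors
    # (all positive divisors smaller than the number itself).
    def is_perfect(n: int) -> bool:
        if n < 2:
            return False  # 1 has no proper divisors, so their sum is 0
        return sum(d for d in range(1, n) if n % d == 0) == n

    print(is_perfect(1))   # False
    print(is_perfect(6))   # True  (1 + 2 + 3 == 6)
    print(is_perfect(28))  # True  (1 + 2 + 4 + 7 + 14 == 28)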