We have ten fingers (including thumbs), and early counting was based on a one-to-one mapping onto these digits, so one reason is simple familiarity. The other advantage of counting in decimal is that fewer digits are required: 4 decimal digits take you over a thousand, while you would need 10 binary digits just to pass a thousand (ten binary digits reach only 1,023). It gets worse with larger numbers: 7 decimal digits go over a million, but it takes 20 binary digits. I have phone numbers with 11 digits (without the international country code). In binary, each of those would be a number of 34 to 37 digits. No thanks!
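If you want to check those digit counts yourself, here is a minimal Python sketch (bin() returns a string with a '0b' prefix, hence the subtraction of 2):

    >>> n = 1_000_000
    >>> len(str(n)), len(bin(n)) - 2
    (7, 20)

So one million really does take 7 decimal digits but 20 binary digits.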
If the two decimal numbers have x and y digits after their decimal points, then the product has (x + y) digits after the decimal point, counted before any trailing zeros are dropped.
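For example, take 0.3 * 0.12: here x = 1 and y = 2, so there will be 1 + 2 = 3 digits after the decimal point. Multiplying 3 * 12 = 36 and counting back three places gives 0.036. (If the product ends in zeros, as in 0.5 * 0.2 = 0.10, the rule still holds before you simplify to 0.1.)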
-- The decimal system (base-10) uses 10 digits (0 through 9) to write all numbers.
-- The binary system (base-2) uses 2 digits (0 and 1) to write all numbers.
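For example, the number thirteen is written 13 in decimal but 1101 in binary, since 13 = 8 + 4 + 1.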
To write out a certain number of millions, move the decimal point six positions to the right. If there are not enough digits, fill the remaining places with zeros.
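For example, 3.2 million: moving the decimal point in 3.2 six places to the right gives 3,200,000, with five zeros filling the missing places.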
The answer is it stays in the same place.
* * * * *
Not quite. Suppose you want to multiply two decimal numbers A and B:
1. Multiply the two numbers ignoring the decimal points.
2. Count the number of digits after the decimal point in A.
3. Count the number of digits after the decimal point in B.
4. Add these two counts together. This is the number of digits you want after the decimal point in the answer, so count back from the end.
Example: 2.54 * 3.5 (this is the number of centimetres in 3.5 inches).
254 * 35 = 8890.
The number of digits after the decimal point in 2.54 is 2 (5 and 4).
The number of digits after the decimal point in 3.5 is 1 (5).
2 + 1 = 3, so there must be 3 digits after the decimal point in the answer.
Therefore 8890 becomes 8.890, which you can now simplify to 8.89.
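Here is a short Python sketch of the same procedure, in case you want to check an answer mechanically (the function name multiply_decimals is just an illustrative choice, and it assumes plain positive inputs like '2.54'):

    def multiply_decimals(a: str, b: str) -> str:
        # Digits after the decimal point in each number.
        x = len(a.partition('.')[2])
        y = len(b.partition('.')[2])
        # Multiply with the decimal points removed.
        product = int(a.replace('.', '')) * int(b.replace('.', ''))
        if x + y == 0:
            return str(product)
        # Pad so there is at least one digit before the point,
        # then count back (x + y) places from the end.
        digits = str(product).rjust(x + y + 1, '0')
        return digits[:-(x + y)] + '.' + digits[-(x + y):]

    print(multiply_decimals('2.54', '3.5'))  # prints 8.890

As in the worked example, you can then simplify 8.890 to 8.89.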
1 million < 16^5, so 5 hexadecimal digits would be enough.
Seven will be more than enough.
The answer depends on the degree of precision required. If only integers are to be represented, then 5 hexadecimal digits would be enough, because 16^5 = 1,048,576 is bigger than a million.
To represent decimal numbers up to 1 million in hexadecimal, you need 5 hex digits. Hexadecimal is base 16, so each digit can represent 16 different values (0-9 and A-F), which makes it a compact way to write large numbers.
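As a concrete check on the answers above: one million in hexadecimal is F4240, which has 5 digits, and the largest 5-digit hex number is FFFFF = 16^5 - 1 = 1,048,575. In Python (hex() prefixes its output with '0x'):

    >>> hex(1_000_000)
    '0xf4240'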
Eliminate the decimal point and make sure there are six digits after the "2", filling any missing places with zeros.
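Taking 2.3 million as an illustrative value (the question's exact figure isn't shown here): 2.3 becomes 23, and padding to six digits after the "2" gives 2,300,000.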
It displays numbers with more digits after the decimal point.
All of them. We normally count in decimal, and decimal digits run from 0 to 9, so every digit in a decimal number is less than ten.
With five digits in base one million, you can create one nonillion (10^30) different numbers; with five digits in the traditional decimal (base-ten) system, you can form only 100 thousand (10^5).
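The general rule behind this: k digits in base b can form b^k different numbers, so five digits give (10^6)^5 = 10^30 (one nonillion) in base one million, but only 10^5 = 100,000 in base ten.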
The two digits after the decimal point.