Q: How many bits are required to count up to decimal 1 million?

Best Answer

20 bits. 2^20 = 1,048,576 is the smallest power of 2 that exceeds 1,000,000 (2^19 = 524,288 is not enough), so 20 bits are sufficient to count from 0 up to 1,000,000.
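For anyone who wants to verify this, here is a small Python sketch (my own illustration, not part of the original answer) using the built-in bit_length() method:

```python
# Minimum bits needed to count up to n (unsigned binary, counting from 0).
def bits_to_count_to(n: int) -> int:
    return n.bit_length()  # same as ceil(log2(n + 1)) for n >= 1

print(bits_to_count_to(1_000_000))  # 20
print(2 ** 20)                      # 1048576, just above 1,000,000
print(2 ** 19)                      # 524288, not enough
```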

Related questions

How many binary bits are required to represent the decimal number 643?

Count them: 643₁₀ = 1010000011₂, which is 10 bits.
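A quick Python check (illustrative only) of the conversion and the bit count:

```python
n = 643
binary = bin(n)[2:]   # strip the '0b' prefix
print(binary)         # 1010000011
print(len(binary))    # 10 bits
```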


What is the minimum number of bits required to represent the decimal 1000?

10 bits. 2^9 = 512 is not enough; 2^10 = 1024 covers 0 through 1023, which includes 1000.


How many bits are required to represent 32 digit decimal number?

107 bits. A 32-digit decimal number can be as large as 10^32 - 1, and 32 × log2(10) ≈ 106.3, so 107 bits are needed.


How many binary digits are required to count to decimal 15?

Assuming you start from 0, you need at least 4 bits. 15 in binary: 15 = 8 + 4 + 2 + 1 = 1111₂


What is the minimum number of binary bits required to represent the decimal number 250?

8 bits. 2^8 = 256 covers 0 through 255; 2^7 = 128 is not enough.


What is the minimum number of bits required to represent the following decimal number 101000?

17 bits. 16 bits only reach 65,535; 17 bits allow values up to 131,071, which covers 101,000.


How many bits are needed to represent decimal value ranging from 0 to 12500?

14 bits. 2^13 = 8,192 is not enough; 2^14 = 16,384 covers 0 through 16,383, which includes 12,500.


How many bits are required in decimal numbers in range 0 to 999 using Straight binary code and BCD code?

Straight binary: 10 bits, since 2^10 = 1024 covers 0 through 999. BCD: 12 bits, since each of the three decimal digits takes its own 4-bit group.
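To make the difference concrete, here is a short Python sketch (my own illustration) encoding 999 both ways:

```python
n = 999
# Straight binary: one field wide enough for the whole value.
print(n.bit_length())                                 # 10 bits
# BCD: each decimal digit packed into its own 4-bit group.
bcd = ''.join(format(int(d), '04b') for d in str(n))
print(bcd, len(bcd))                                  # 100110011001 12
```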


What is the maximum number that you can count up to using 10 bits or 14 bits, and how many bits are needed to count up to a maximum of 511 or 63?

Using n bits, you can count to 2^n - 1 (for unsigned integers). So 10 bits: 2^10 - 1 = 1023; 14 bits: 2^14 - 1 = 16,383. To count to 511 you need log2(511 + 1) = log2(512) = 9 bits. To count to 63 you need log2(63 + 1) = log2(64) = 6 bits.
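A small Python sketch (my own, assuming unsigned integers as above) that applies the formula in both directions:

```python
import math

def max_count(bits: int) -> int:
    # Largest unsigned value that fits in `bits` bits: 2^bits - 1.
    return (1 << bits) - 1

def bits_needed(max_value: int) -> int:
    # Bits needed to count from 0 up to max_value.
    return math.ceil(math.log2(max_value + 1))

print(max_count(10), max_count(14))       # 1023 16383
print(bits_needed(511), bits_needed(63))  # 9 6
```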


How many decimal numbers can be represented by 4-bits?

16. With 4 bits you can represent 2^4 = 16 values, 0 through 15.


How many bits are needed to represent decimal 200?

8 bits if unsigned (0 to 255); 9 bits if signed (two's-complement range -256 to 255).
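Here is an illustrative Python sketch (the helper names are my own, and signed values are assumed to be two's complement):

```python
def unsigned_bits(n: int) -> int:
    # Smallest width whose unsigned range [0, 2^b - 1] contains n.
    return max(n.bit_length(), 1)

def signed_bits(n: int) -> int:
    # Smallest width whose two's-complement range [-2^(b-1), 2^(b-1) - 1] contains n.
    b = 1
    while not -(1 << (b - 1)) <= n <= (1 << (b - 1)) - 1:
        b += 1
    return b

print(unsigned_bits(200))  # 8
print(signed_bits(200))    # 9
```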


What is the number that you can count up to using 10 bits?

The highest number you can count up to using 10 bits is 2^10 - 1 = 1023 (unsigned binary, counting from 0).