
8 (assuming unsigned numbers - i.e., you don't reserve a bit for the sign).

Wiki User

10y ago

Related Questions

How many binary bits are needed to represent decimal number 21?

5 — 21 in binary is 10101, which is five bits.


How many bits are required to represent an eight-digit decimal number in BCD?

To represent an eight-digit decimal number in Binary-Coded Decimal (BCD), each decimal digit is encoded using 4 bits. Since there are 8 digits in the number, the total number of bits required is 8 digits × 4 bits/digit = 32 bits. Therefore, 32 bits are needed to represent an eight-digit decimal number in BCD.
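The digit-by-digit encoding can be sketched in Python (the helper name `to_bcd` is just for illustration):

```python
def to_bcd(n: int, digits: int = 8) -> str:
    """Encode a non-negative integer as BCD: 4 bits per decimal digit."""
    return "".join(format(int(d), "04b") for d in f"{n:0{digits}d}")

bcd = to_bcd(12345678)
print(bcd)       # each digit becomes its own 4-bit group
print(len(bcd))  # 32
```

Each of the 8 digits contributes exactly one 4-bit group, so the output is always 32 bits long.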


How many bits are needed to represent decimal 200?

8 bits if unsigned, 9 bits if signed


How many bits are needed to represent decimal value ranging from 0 to 12500?

14 bits. 2^13 − 1 = 8191 is too small, while 2^14 − 1 = 16383 covers the whole range 0 to 12,500.
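A quick check in Python confirms that 14 bits cover 0 to 12,500 — `int.bit_length()` gives the minimum number of bits directly, and the floor-of-log formula agrees:

```python
import math

n = 12500
print(n.bit_length())                 # 14
print(math.floor(math.log2(n)) + 1)  # 14
print(2**14 - 1)                      # 16383, the largest 14-bit value
```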


What is the minimum number of bits required to represent the decimal 1000?

10 bits. 2^9 − 1 = 511 is too small; 1000 in binary is 1111101000, which is ten bits.



How many bits need to represent the decimal number 200?

8


What is the minimum number of binary bits required to represent the decimal number 250?

8


How many bits are required to represent 32 digit decimal number?

107. The largest 32-digit decimal number is 10^32 − 1, and log2(10^32) = 32 × log2(10) ≈ 106.3, so 107 bits are needed.
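The arithmetic for a 32-digit decimal number can be verified in Python — ceiling the log matches the exact bit length of 10^32 − 1:

```python
import math

d = 32  # number of decimal digits
print(math.ceil(d * math.log2(10)))  # 107
print((10**d - 1).bit_length())      # 107
```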


Four bytes can represent a decimal number between 0 and?

Four bytes are 32 bits, giving 2^32 = 4,294,967,296 possible values, so an unsigned four-byte number can range from 0 to 4,294,967,295.


What is the minimum number of bits required to represent the following decimal number 101000?

17. log2(101000) ≈ 16.6, so 17 bits are needed; 17 bits allow values up to 131071.


How many bits are needed to represent the decimal number 200?

log2(200) = ln 200 ÷ ln 2 ≈ 7.64, so 8 bits are needed. If a signed number is being stored, then 9 bits are needed, as one bit indicates the sign.
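The rule above generalizes to any value; a small Python sketch (the helper name `bits_needed` is just for illustration, with the signed case adding one sign bit):

```python
def bits_needed(n: int, signed: bool = False) -> int:
    """Minimum bits to store non-negative n; signed adds one sign bit."""
    unsigned_bits = max(1, n.bit_length())
    return unsigned_bits + 1 if signed else unsigned_bits

print(bits_needed(200))               # 8
print(bits_needed(200, signed=True))  # 9
print(bits_needed(21))                # 5
```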